Dec 06 01:02:05 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 06 01:02:05 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 06 01:02:05 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 
01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 
crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 
01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:05 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 06 01:02:06 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 06 01:02:06 crc kubenswrapper[4991]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 01:02:06 crc kubenswrapper[4991]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 06 01:02:06 crc kubenswrapper[4991]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 01:02:06 crc kubenswrapper[4991]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 06 01:02:06 crc kubenswrapper[4991]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 06 01:02:06 crc kubenswrapper[4991]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.855887 4991 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860131 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860164 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860171 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860176 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860181 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860188 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860193 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860199 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860205 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860211 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860218 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860225 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860232 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860237 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860243 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860247 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 
01:02:06.860252 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860258 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860262 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860267 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860272 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860279 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860285 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860290 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860296 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860308 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860314 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860319 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860324 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860330 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860336 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860342 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860350 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860356 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860363 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860368 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860373 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860380 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860387 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860392 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860397 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860402 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860407 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860412 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860416 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860420 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860426 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860431 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860436 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860441 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860446 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860450 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860455 4991 feature_gate.go:330] 
unrecognized feature gate: AlibabaPlatform Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860460 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860465 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860470 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860475 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860479 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860484 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860488 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860493 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860497 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860502 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860507 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860513 4991 feature_gate.go:330] unrecognized feature gate: Example Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860519 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860524 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 
01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860529 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860534 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860538 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.860543 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860850 4991 flags.go:64] FLAG: --address="0.0.0.0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860873 4991 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860886 4991 flags.go:64] FLAG: --anonymous-auth="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860899 4991 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860909 4991 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860916 4991 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860925 4991 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860932 4991 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860939 4991 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860946 4991 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860953 4991 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860960 4991 
flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860966 4991 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860973 4991 flags.go:64] FLAG: --cgroup-root="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.860980 4991 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861011 4991 flags.go:64] FLAG: --client-ca-file="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861018 4991 flags.go:64] FLAG: --cloud-config="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861045 4991 flags.go:64] FLAG: --cloud-provider="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861052 4991 flags.go:64] FLAG: --cluster-dns="[]" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861062 4991 flags.go:64] FLAG: --cluster-domain="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861068 4991 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861074 4991 flags.go:64] FLAG: --config-dir="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861079 4991 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861086 4991 flags.go:64] FLAG: --container-log-max-files="5" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861094 4991 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861100 4991 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861107 4991 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861113 4991 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861119 4991 flags.go:64] 
FLAG: --contention-profiling="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861125 4991 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861130 4991 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861136 4991 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861142 4991 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861151 4991 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861158 4991 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861164 4991 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861170 4991 flags.go:64] FLAG: --enable-load-reader="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861175 4991 flags.go:64] FLAG: --enable-server="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861181 4991 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861189 4991 flags.go:64] FLAG: --event-burst="100" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861196 4991 flags.go:64] FLAG: --event-qps="50" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861202 4991 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861208 4991 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861215 4991 flags.go:64] FLAG: --eviction-hard="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861223 4991 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861229 4991 flags.go:64] 
FLAG: --eviction-minimum-reclaim="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861235 4991 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861241 4991 flags.go:64] FLAG: --eviction-soft="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861247 4991 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861253 4991 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861259 4991 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861264 4991 flags.go:64] FLAG: --experimental-mounter-path="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861270 4991 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861275 4991 flags.go:64] FLAG: --fail-swap-on="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861280 4991 flags.go:64] FLAG: --feature-gates="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861288 4991 flags.go:64] FLAG: --file-check-frequency="20s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861294 4991 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861300 4991 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861305 4991 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861311 4991 flags.go:64] FLAG: --healthz-port="10248" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861316 4991 flags.go:64] FLAG: --help="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861322 4991 flags.go:64] FLAG: --hostname-override="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861327 4991 flags.go:64] FLAG: 
--housekeeping-interval="10s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861333 4991 flags.go:64] FLAG: --http-check-frequency="20s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861339 4991 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861345 4991 flags.go:64] FLAG: --image-credential-provider-config="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861351 4991 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861357 4991 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861362 4991 flags.go:64] FLAG: --image-service-endpoint="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861368 4991 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861373 4991 flags.go:64] FLAG: --kube-api-burst="100" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861379 4991 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861388 4991 flags.go:64] FLAG: --kube-api-qps="50" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861393 4991 flags.go:64] FLAG: --kube-reserved="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861399 4991 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861405 4991 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861412 4991 flags.go:64] FLAG: --kubelet-cgroups="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861417 4991 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861423 4991 flags.go:64] FLAG: --lock-file="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861429 4991 
flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861434 4991 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861440 4991 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861449 4991 flags.go:64] FLAG: --log-json-split-stream="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861455 4991 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861460 4991 flags.go:64] FLAG: --log-text-split-stream="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861465 4991 flags.go:64] FLAG: --logging-format="text" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861471 4991 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861479 4991 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861485 4991 flags.go:64] FLAG: --manifest-url="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861492 4991 flags.go:64] FLAG: --manifest-url-header="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861501 4991 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861509 4991 flags.go:64] FLAG: --max-open-files="1000000" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861518 4991 flags.go:64] FLAG: --max-pods="110" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861524 4991 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861530 4991 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861536 4991 flags.go:64] FLAG: --memory-manager-policy="None" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 
01:02:06.861542 4991 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861550 4991 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861556 4991 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861562 4991 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861579 4991 flags.go:64] FLAG: --node-status-max-images="50" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861586 4991 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861593 4991 flags.go:64] FLAG: --oom-score-adj="-999" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861599 4991 flags.go:64] FLAG: --pod-cidr="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861606 4991 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861619 4991 flags.go:64] FLAG: --pod-manifest-path="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861626 4991 flags.go:64] FLAG: --pod-max-pids="-1" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861631 4991 flags.go:64] FLAG: --pods-per-core="0" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861639 4991 flags.go:64] FLAG: --port="10250" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861645 4991 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861650 4991 flags.go:64] FLAG: --provider-id="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861656 4991 flags.go:64] FLAG: --qos-reserved="" Dec 06 01:02:06 crc 
kubenswrapper[4991]: I1206 01:02:06.861663 4991 flags.go:64] FLAG: --read-only-port="10255" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861669 4991 flags.go:64] FLAG: --register-node="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861675 4991 flags.go:64] FLAG: --register-schedulable="true" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861680 4991 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861693 4991 flags.go:64] FLAG: --registry-burst="10" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861699 4991 flags.go:64] FLAG: --registry-qps="5" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861705 4991 flags.go:64] FLAG: --reserved-cpus="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861711 4991 flags.go:64] FLAG: --reserved-memory="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861718 4991 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861724 4991 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861731 4991 flags.go:64] FLAG: --rotate-certificates="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861737 4991 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861743 4991 flags.go:64] FLAG: --runonce="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861748 4991 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861754 4991 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861761 4991 flags.go:64] FLAG: --seccomp-default="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861766 4991 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 06 01:02:06 crc 
kubenswrapper[4991]: I1206 01:02:06.861772 4991 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861778 4991 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861785 4991 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861791 4991 flags.go:64] FLAG: --storage-driver-password="root" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861798 4991 flags.go:64] FLAG: --storage-driver-secure="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861804 4991 flags.go:64] FLAG: --storage-driver-table="stats" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861809 4991 flags.go:64] FLAG: --storage-driver-user="root" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861816 4991 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861822 4991 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861827 4991 flags.go:64] FLAG: --system-cgroups="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861833 4991 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861842 4991 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861848 4991 flags.go:64] FLAG: --tls-cert-file="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861853 4991 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861863 4991 flags.go:64] FLAG: --tls-min-version="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861868 4991 flags.go:64] FLAG: --tls-private-key-file="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861873 4991 flags.go:64] FLAG: 
--topology-manager-policy="none" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861879 4991 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861885 4991 flags.go:64] FLAG: --topology-manager-scope="container" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861892 4991 flags.go:64] FLAG: --v="2" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861900 4991 flags.go:64] FLAG: --version="false" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861908 4991 flags.go:64] FLAG: --vmodule="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861915 4991 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.861921 4991 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862086 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862097 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862103 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862109 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862115 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862120 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862125 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862129 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862134 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862140 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862145 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862150 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862155 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862160 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862165 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862170 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862175 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862180 4991 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862184 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862189 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862194 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862199 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862204 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862208 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862213 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862217 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862222 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862226 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862231 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862237 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862241 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862247 4991 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862251 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862258 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862263 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862268 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862274 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862279 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862283 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862288 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862293 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862306 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862313 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862318 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862323 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862328 4991 feature_gate.go:330] unrecognized 
feature gate: MachineAPIMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862334 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862339 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862344 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862350 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862355 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862360 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862365 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862370 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862376 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862382 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862387 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862392 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862397 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862401 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862406 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862410 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862414 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862419 4991 feature_gate.go:330] unrecognized feature gate: Example Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862424 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862428 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862433 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862439 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862453 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862460 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.862465 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.862480 4991 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.872172 4991 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.872228 4991 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872390 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872418 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872429 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872440 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872449 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872459 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872467 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872476 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872485 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872492 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872500 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872508 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872517 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872525 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872535 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872546 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872554 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872563 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872571 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872578 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872586 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872594 4991 feature_gate.go:330] unrecognized feature gate: Example Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872604 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872613 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872623 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872632 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872641 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872651 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872660 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872668 4991 feature_gate.go:330] unrecognized feature 
gate: PrivateHostedZoneAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872676 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872686 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872695 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872704 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872711 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872719 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872727 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872735 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872744 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872751 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872760 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872767 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872775 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872783 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 
01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872791 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872798 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872806 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872814 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872822 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872830 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872838 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872846 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872854 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872861 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872870 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872877 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872885 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872894 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872903 4991 feature_gate.go:330] 
unrecognized feature gate: NetworkSegmentation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872912 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872920 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872928 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872936 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872943 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872951 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872959 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872967 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.872975 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873023 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873035 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873079 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.873093 4991 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873335 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873350 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873362 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873371 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873379 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873388 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873396 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873405 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873413 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873421 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873429 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873438 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873446 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873455 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873463 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873471 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 
01:02:06.873479 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873487 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873497 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873506 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873513 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873521 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873530 4991 feature_gate.go:330] unrecognized feature gate: Example Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873538 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873546 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873553 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873563 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873573 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873582 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873591 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873601 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873610 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873618 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873626 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873634 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873643 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873651 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873659 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873667 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873675 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873683 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873690 4991 feature_gate.go:330] 
unrecognized feature gate: OVNObservability Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873698 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873705 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873714 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873721 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873729 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873736 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873744 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873752 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873763 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873771 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873779 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873787 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873795 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873802 4991 feature_gate.go:330] unrecognized feature gate: 
NutanixMultiSubnets Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873810 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873818 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873826 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873834 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873842 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873850 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873858 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873865 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873874 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873883 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873893 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873904 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873913 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873924 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.873933 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.873946 4991 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.874256 4991 server.go:940] "Client rotation is on, will bootstrap in background" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.879623 4991 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.879770 4991 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.880566 4991 server.go:997] "Starting client certificate rotation" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.880615 4991 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.880917 4991 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 17:54:28.309611606 +0000 UTC Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.881121 4991 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 856h52m21.428496725s for next certificate rotation Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.887335 4991 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.890026 4991 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.900466 4991 log.go:25] "Validated CRI v1 runtime API" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.929070 4991 log.go:25] "Validated CRI v1 image API" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.932745 4991 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.937641 4991 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-06-00-57-36-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.937684 4991 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.958621 4991 manager.go:217] Machine: {Timestamp:2025-12-06 01:02:06.95604886 +0000 UTC m=+0.306339378 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a16d5df6-771f-4852-a5a1-7a32af9d583d BootID:596c3607-4a7a-43fe-9f86-5d0b3f7502dc Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:73:f0:87 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:73:f0:87 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8c:fe:27 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2c:e2:35 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:27:a5:fd Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:17:42:70 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:d2:d6:e0:5d:e1:ea Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6a:ab:d6:bd:d5:eb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.959103 4991 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.959554 4991 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.960151 4991 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.960485 4991 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.960548 4991 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.960883 4991 topology_manager.go:138] "Creating topology manager with none policy"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.960902 4991 container_manager_linux.go:303] "Creating device plugin manager"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.961162 4991 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.961266 4991 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.961714 4991 state_mem.go:36] "Initialized new in-memory state store"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.962460 4991 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.964667 4991 kubelet.go:418] "Attempting to sync node with API server"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.964693 4991 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.964739 4991 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.964756 4991 kubelet.go:324] "Adding apiserver pod source"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.964785 4991 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.966893 4991 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.967322 4991 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.968766 4991 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.969124 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 06 01:02:06 crc kubenswrapper[4991]: E1206 01:02:06.969264 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.969159 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 06 01:02:06 crc kubenswrapper[4991]: E1206 01:02:06.969367 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969444 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969463 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969470 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969478 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969490 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969498 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969505 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969516 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969527 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969535 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969569 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969577 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.969870 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.970388 4991 server.go:1280] "Started kubelet"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.970724 4991 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.970755 4991 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.971713 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 06 01:02:06 crc systemd[1]: Started Kubernetes Kubelet.
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.974608 4991 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.976610 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.976675 4991 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.977058 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 19:43:37.629218502 +0000 UTC
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.977692 4991 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.977735 4991 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 06 01:02:06 crc kubenswrapper[4991]: E1206 01:02:06.975561 4991 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e7a9eecb27f83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 01:02:06.970355587 +0000 UTC m=+0.320646055,LastTimestamp:2025-12-06 01:02:06.970355587 +0000 UTC m=+0.320646055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.978133 4991 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 06 01:02:06 crc kubenswrapper[4991]: E1206 01:02:06.979255 4991 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.980358 4991 server.go:460] "Adding debug handlers to kubelet server"
Dec 06 01:02:06 crc kubenswrapper[4991]: E1206 01:02:06.983646 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms"
Dec 06 01:02:06 crc kubenswrapper[4991]: W1206 01:02:06.983647 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 06 01:02:06 crc kubenswrapper[4991]: E1206 01:02:06.983777 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984207 4991 factory.go:55] Registering systemd factory
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984238 4991 factory.go:221] Registration of the systemd container factory successfully
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984687 4991 factory.go:153] Registering CRI-O factory
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984707 4991 factory.go:221] Registration of the crio container factory successfully
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984804 4991 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984837 4991 factory.go:103] Registering Raw factory
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.984861 4991 manager.go:1196] Started watching for new ooms in manager
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.985953 4991 manager.go:319] Starting recovery of all containers
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994209 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994295 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994317 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994342 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994361 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994380 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994406 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994423 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994447 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994466 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994486 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994504 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994523 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994548 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994566 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994586 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994608 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994626 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994647 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994666 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994687 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994705 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994723 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994741 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994760 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994780 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994847 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994869 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994888 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994906 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994924 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994943 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.994968 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995016 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995037 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995058 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995078 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995099 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995121 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995139 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995158 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995179 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995231 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995252 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995270 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995289 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995311 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995330 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995350 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995369 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995389 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995406 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995431 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995451 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995469 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995489 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995510 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.995529 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996681 4991 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996739 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996763 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996778 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996798 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996812 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996829 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996844 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996858 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996872 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996888 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996902 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996916 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996929 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996942 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996957 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996969 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.996996 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997010 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997023 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997035 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997048 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997064 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997078 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997093 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997108 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997120 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997132 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997145 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997157 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997169 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997188 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997201 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997214 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997226 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997237 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997252 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997284 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997296 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997308 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997320 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997332 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" 
seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997343 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 06 01:02:06 crc kubenswrapper[4991]: I1206 01:02:06.997355 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997366 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997378 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997391 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997421 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: 
I1206 01:02:06.997436 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997449 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997463 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997479 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997490 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997501 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997515 4991 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997528 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997539 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997550 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997560 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997577 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997589 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997600 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997611 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997625 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997636 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997648 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997658 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997668 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997679 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997689 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997701 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997713 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997723 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997734 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997747 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997759 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997768 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997779 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997789 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 06 01:02:07 crc 
kubenswrapper[4991]: I1206 01:02:06.997800 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997811 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997822 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997832 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997842 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997853 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997864 4991 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997876 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997887 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997922 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997937 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997949 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.997960 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998002 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998015 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998025 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998038 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998050 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998078 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998090 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998101 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998111 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998125 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998137 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998147 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" 
seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998158 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998169 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998183 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998202 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998214 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998227 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998241 
4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998558 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998578 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998621 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998634 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998656 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998669 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998683 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998703 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998715 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998726 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998745 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998755 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998771 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998783 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998794 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998808 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998824 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998839 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998848 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998860 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998877 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998888 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998907 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998920 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998931 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998944 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998958 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.998973 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999015 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999026 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999042 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999051 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999289 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999327 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999374 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999401 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999443 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999468 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999493 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999515 4991 reconstruct.go:97] "Volume reconstruction finished" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:06.999542 4991 reconciler.go:26] "Reconciler: start to sync state" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.013001 4991 manager.go:324] Recovery completed Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.021928 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.023509 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.023572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.023589 
4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.024281 4991 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.024303 4991 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.024328 4991 state_mem.go:36] "Initialized new in-memory state store" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.034292 4991 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.036428 4991 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.036497 4991 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.036551 4991 kubelet.go:2335] "Starting kubelet main sync loop" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.036645 4991 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 06 01:02:07 crc kubenswrapper[4991]: W1206 01:02:07.040134 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.041016 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 06 
01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.042235 4991 policy_none.go:49] "None policy: Start" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.042955 4991 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.043031 4991 state_mem.go:35] "Initializing new in-memory state store" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.079510 4991 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.085016 4991 manager.go:334] "Starting Device Plugin manager" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.085083 4991 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.085102 4991 server.go:79] "Starting device plugin registration server" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.085857 4991 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.085902 4991 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.086105 4991 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.086224 4991 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.086241 4991 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.098731 4991 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.137714 4991 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.137955 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.139243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.139290 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.139306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.139460 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.139585 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.139625 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140449 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140523 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140577 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140805 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140923 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.140962 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.141861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.141881 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.141890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.141866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142148 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142317 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142356 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142746 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142863 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.142965 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143012 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143033 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143568 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143643 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143694 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143875 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.143910 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.144787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.144824 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.144844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.184803 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.187644 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.189434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.189510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.189533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.189609 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.190394 4991 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.203746 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.203825 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.203886 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.203924 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.203957 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204063 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204119 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204239 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204291 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204323 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204346 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204367 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204509 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.204570 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306507 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306628 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306684 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306720 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306755 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306788 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306779 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306821 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306852 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306886 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306920 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" 
(UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306956 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306971 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307023 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.306913 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307105 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: 
I1206 01:02:07.307138 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307169 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307027 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307233 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307161 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307154 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307155 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307074 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307299 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307069 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307303 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307346 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307102 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.307202 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.390818 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.392382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.392524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.392555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.392604 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.393348 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" 
node="crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.469045 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.484354 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.491727 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: W1206 01:02:07.505778 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f4f16ad41c7b47354260f34cc0a18383bd2516778b3872fe671f86cc2386b34e WatchSource:0}: Error finding container f4f16ad41c7b47354260f34cc0a18383bd2516778b3872fe671f86cc2386b34e: Status 404 returned error can't find the container with id f4f16ad41c7b47354260f34cc0a18383bd2516778b3872fe671f86cc2386b34e Dec 06 01:02:07 crc kubenswrapper[4991]: W1206 01:02:07.516955 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-65bcc7b4ff2f989850e68c849b61de5102e29ee670d328a8e293e7bde34e9b70 WatchSource:0}: Error finding container 65bcc7b4ff2f989850e68c849b61de5102e29ee670d328a8e293e7bde34e9b70: Status 404 returned error can't find the container with id 65bcc7b4ff2f989850e68c849b61de5102e29ee670d328a8e293e7bde34e9b70 Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.526350 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: W1206 01:02:07.526710 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-68b275ab0916101fb86952d45d301945f63b670b1ccb28be6fda67ebe78931d4 WatchSource:0}: Error finding container 68b275ab0916101fb86952d45d301945f63b670b1ccb28be6fda67ebe78931d4: Status 404 returned error can't find the container with id 68b275ab0916101fb86952d45d301945f63b670b1ccb28be6fda67ebe78931d4 Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.531494 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:07 crc kubenswrapper[4991]: W1206 01:02:07.549113 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4bfbf36f76cfd9b2a6bdb5b77b7cc5943e2822766a137f12c56770207b658fe8 WatchSource:0}: Error finding container 4bfbf36f76cfd9b2a6bdb5b77b7cc5943e2822766a137f12c56770207b658fe8: Status 404 returned error can't find the container with id 4bfbf36f76cfd9b2a6bdb5b77b7cc5943e2822766a137f12c56770207b658fe8 Dec 06 01:02:07 crc kubenswrapper[4991]: W1206 01:02:07.558238 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ecbaa9130f6a334ef68157c6fe9ace1d8723bcdc6ef6b8ab246cdce6357382ec WatchSource:0}: Error finding container ecbaa9130f6a334ef68157c6fe9ace1d8723bcdc6ef6b8ab246cdce6357382ec: Status 404 returned error can't find the container with id ecbaa9130f6a334ef68157c6fe9ace1d8723bcdc6ef6b8ab246cdce6357382ec Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.586414 4991 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.793536 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.795653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.795739 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.795757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.795795 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:07 crc kubenswrapper[4991]: E1206 01:02:07.796458 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.973372 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:07 crc kubenswrapper[4991]: I1206 01:02:07.977372 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 05:52:28.445557223 +0000 UTC Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.044047 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4c9ac5eacade675cc065a0b9bbf0f5ef9622e237ff8dd6cc253f446184fba1f0" exitCode=0 Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.044141 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c9ac5eacade675cc065a0b9bbf0f5ef9622e237ff8dd6cc253f446184fba1f0"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.044279 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"65bcc7b4ff2f989850e68c849b61de5102e29ee670d328a8e293e7bde34e9b70"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.044409 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.045641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.045686 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.045709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.046834 4991 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345" exitCode=0 Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.046883 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345"} Dec 06 01:02:08 
crc kubenswrapper[4991]: I1206 01:02:08.046911 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f4f16ad41c7b47354260f34cc0a18383bd2516778b3872fe671f86cc2386b34e"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.047127 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.048337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.048359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.048368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.050447 4991 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9" exitCode=0 Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.050497 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.050514 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ecbaa9130f6a334ef68157c6fe9ace1d8723bcdc6ef6b8ab246cdce6357382ec"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.050579 4991 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.051181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.051203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.051214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.052762 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.052788 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4bfbf36f76cfd9b2a6bdb5b77b7cc5943e2822766a137f12c56770207b658fe8"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.054353 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72" exitCode=0 Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.054419 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.054470 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"68b275ab0916101fb86952d45d301945f63b670b1ccb28be6fda67ebe78931d4"} Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.054653 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.055933 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.055978 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.056028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.057863 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.058751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.058784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.058793 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:08 crc kubenswrapper[4991]: W1206 01:02:08.168335 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.168664 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 06 01:02:08 crc kubenswrapper[4991]: W1206 01:02:08.261757 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.262041 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.388277 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 06 01:02:08 crc kubenswrapper[4991]: W1206 01:02:08.503942 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.504054 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: 
connect: connection refused" logger="UnhandledError" Dec 06 01:02:08 crc kubenswrapper[4991]: W1206 01:02:08.532941 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.533066 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.596821 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.598366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.598721 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.598843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.598957 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.599961 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 06 01:02:08 crc kubenswrapper[4991]: E1206 01:02:08.890087 4991 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e7a9eecb27f83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 01:02:06.970355587 +0000 UTC m=+0.320646055,LastTimestamp:2025-12-06 01:02:06.970355587 +0000 UTC m=+0.320646055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.973456 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:08 crc kubenswrapper[4991]: I1206 01:02:08.978520 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 19:57:21.54873602 +0000 UTC Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.059692 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c2cf257527acb99d14685360262c24dfe896a1ae4e11680b2b8e1c68720182ca"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.059845 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.061169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 
01:02:09.061226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.061240 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.064531 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.064577 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.069171 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.069239 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.072618 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.072684 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.076095 4991 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e90c095e0eedbf69c2bb3c468075593ce3376700cdc571994afac4d49a09e5fd" exitCode=0 Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.076181 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e90c095e0eedbf69c2bb3c468075593ce3376700cdc571994afac4d49a09e5fd"} Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.076372 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.077375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.077415 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.077429 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.972670 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.978763 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 
02:25:15.876463436 +0000 UTC Dec 06 01:02:09 crc kubenswrapper[4991]: I1206 01:02:09.978790 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 97h23m5.897677474s for next certificate rotation Dec 06 01:02:09 crc kubenswrapper[4991]: E1206 01:02:09.989626 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.081465 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e"} Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.081531 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.086248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.086326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.086341 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.088938 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315"} Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.089033 4991 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.089826 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.089850 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.089859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092200 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18"} Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092231 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96"} Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092243 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef"} Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092247 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092849 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.092895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.094646 4991 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="76f9a40591f2f1c682e7f534acb271d5e015c380acbd8495e30082b0e4114b47" exitCode=0 Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.094676 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"76f9a40591f2f1c682e7f534acb271d5e015c380acbd8495e30082b0e4114b47"} Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.094781 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.095609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.095643 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.095654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.200419 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.201807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.201893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.201912 
4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.201958 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:10 crc kubenswrapper[4991]: E1206 01:02:10.202754 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.478973 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:10 crc kubenswrapper[4991]: I1206 01:02:10.771758 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.101878 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"380f473b8f2c7bb765927d3ce7e5b8c8b7e256f33563a3b757ef6a9607bd797e"} Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.101928 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b45e185f43e3c61c2066f6d137d91c8f928e66602e40f9e8fd53109b2a4d949f"} Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.101943 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"87816861e9eab5693a2d8d426667bd10220b806d625f48022cb7a64f01cbdda8"} Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.103327 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.105267 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18" exitCode=255 Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.105386 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.105541 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18"} Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.105676 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.105750 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.105772 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.106164 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.106194 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.106204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.106939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.106973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.107004 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.107000 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.107137 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.107165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.107960 4991 scope.go:117] "RemoveContainer" containerID="edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18" Dec 06 01:02:11 crc kubenswrapper[4991]: I1206 01:02:11.611307 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.117589 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a28671e696c6a362c864f4faec7570ef5396fdebb90e18e21d60b5a109982882"} Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.117679 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c051621cde74bbe99154e5e507969df2760eb4f1a35dbca7eb839e13aa47e6d5"} Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.117792 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.119425 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.119468 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.119483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.122695 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.126249 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.126297 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.126657 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f"} Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.128097 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.128179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.128207 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 
01:02:12.128261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.128290 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.128304 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.394128 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.487409 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.487756 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.490175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.490256 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:12 crc kubenswrapper[4991]: I1206 01:02:12.490279 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.129905 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.129972 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.130150 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.131366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.131405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.131418 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.131964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.132043 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.132087 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.403352 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.405737 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.405796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.405806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.405839 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:13 crc kubenswrapper[4991]: I1206 01:02:13.618351 4991 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.132751 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.132778 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.134048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.134082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.134095 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.134107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.134159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.134177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.666826 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.667123 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.668979 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:14 crc 
kubenswrapper[4991]: I1206 01:02:14.669209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.669248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:14 crc kubenswrapper[4991]: I1206 01:02:14.874064 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:15 crc kubenswrapper[4991]: I1206 01:02:15.136068 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:15 crc kubenswrapper[4991]: I1206 01:02:15.138212 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:15 crc kubenswrapper[4991]: I1206 01:02:15.138333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:15 crc kubenswrapper[4991]: I1206 01:02:15.138356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.536228 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.536531 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.538393 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.538482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.538507 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.545955 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.580728 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.581057 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.582466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.582544 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:16 crc kubenswrapper[4991]: I1206 01:02:16.582563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:17 crc kubenswrapper[4991]: E1206 01:02:17.098879 4991 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.142230 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.143404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.143466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.143480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.147913 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.874641 4991 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 01:02:17 crc kubenswrapper[4991]: I1206 01:02:17.874763 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 01:02:18 crc kubenswrapper[4991]: I1206 01:02:18.144479 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:18 crc kubenswrapper[4991]: I1206 01:02:18.145785 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:18 crc kubenswrapper[4991]: I1206 01:02:18.145857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:18 crc kubenswrapper[4991]: I1206 01:02:18.145877 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:20 crc kubenswrapper[4991]: W1206 01:02:20.351804 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: 
TLS handshake timeout Dec 06 01:02:20 crc kubenswrapper[4991]: I1206 01:02:20.351917 4991 trace.go:236] Trace[2137712410]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 01:02:10.350) (total time: 10001ms): Dec 06 01:02:20 crc kubenswrapper[4991]: Trace[2137712410]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:02:20.351) Dec 06 01:02:20 crc kubenswrapper[4991]: Trace[2137712410]: [10.001146091s] [10.001146091s] END Dec 06 01:02:20 crc kubenswrapper[4991]: E1206 01:02:20.351944 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 06 01:02:20 crc kubenswrapper[4991]: W1206 01:02:20.949058 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 06 01:02:20 crc kubenswrapper[4991]: I1206 01:02:20.949203 4991 trace.go:236] Trace[1167978150]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 01:02:10.947) (total time: 10001ms): Dec 06 01:02:20 crc kubenswrapper[4991]: Trace[1167978150]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:02:20.949) Dec 06 01:02:20 crc kubenswrapper[4991]: Trace[1167978150]: [10.001640801s] [10.001640801s] END Dec 06 01:02:20 crc kubenswrapper[4991]: E1206 01:02:20.949243 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 06 01:02:20 crc kubenswrapper[4991]: I1206 01:02:20.974402 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.066601 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.066690 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.073404 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.073459 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.094239 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.094342 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.619941 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]log ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]etcd ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/openshift.io-api-request-count-filter ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/openshift.io-startkubeinformers ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/priority-and-fairness-config-consumer ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/priority-and-fairness-filter ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-apiextensions-informers ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-apiextensions-controllers ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/crd-informer-synced ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-system-namespaces-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-cluster-authentication-info-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-legacy-token-tracking-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-service-ip-repair-controllers ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Dec 06 01:02:21 crc kubenswrapper[4991]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/priority-and-fairness-config-producer ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/bootstrap-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/start-kube-aggregator-informers ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-status-local-available-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-status-remote-available-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-registration-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-wait-for-first-sync ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-discovery-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/kube-apiserver-autoregistration ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]autoregister-completion ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-openapi-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: [+]poststarthook/apiservice-openapiv3-controller ok
Dec 06 01:02:21 crc kubenswrapper[4991]: livez check failed
Dec 06 01:02:21 crc kubenswrapper[4991]: I1206 01:02:21.620071 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 06 01:02:23 crc kubenswrapper[4991]: I1206 01:02:23.658777 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 06 01:02:23 crc kubenswrapper[4991]: I1206 01:02:23.659491 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 01:02:23 crc kubenswrapper[4991]: I1206 01:02:23.662147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:02:23 crc kubenswrapper[4991]: I1206 01:02:23.662222 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:02:23 crc kubenswrapper[4991]: I1206 01:02:23.662244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:02:23 crc kubenswrapper[4991]: I1206 01:02:23.682842 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 06 01:02:24 crc kubenswrapper[4991]: I1206 01:02:24.161829 4991
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 06 01:02:24 crc kubenswrapper[4991]: I1206 01:02:24.163132 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:02:24 crc kubenswrapper[4991]: I1206 01:02:24.163219 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:02:24 crc kubenswrapper[4991]: I1206 01:02:24.163248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.816127 4991 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.976695 4991 apiserver.go:52] "Watching apiserver"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.982509 4991 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.983029 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"]
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.983500 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.983587 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.983750 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:02:25 crc kubenswrapper[4991]: E1206 01:02:25.984102 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.984149 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 06 01:02:25 crc kubenswrapper[4991]: E1206 01:02:25.984175 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.984746 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.984776 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:02:25 crc kubenswrapper[4991]: E1206 01:02:25.984840 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987324 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987491 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987615 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987672 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987621 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987697 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.987888 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.988572 4991 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.988874 4991 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 01:02:25 crc kubenswrapper[4991]: I1206 01:02:25.989511 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.034890 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.052611 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.067240 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.069552 4991 trace.go:236] Trace[437242583]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 
(06-Dec-2025 01:02:11.485) (total time: 14583ms):
Dec 06 01:02:26 crc kubenswrapper[4991]: Trace[437242583]: ---"Objects listed" error: 14583ms (01:02:26.069)
Dec 06 01:02:26 crc kubenswrapper[4991]: Trace[437242583]: [14.58369536s] [14.58369536s] END
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.069571 4991 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.072102 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.072796 4991 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.073151 4991 trace.go:236] Trace[131501526]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Dec-2025 01:02:11.181) (total time: 14891ms):
Dec 06 01:02:26 crc kubenswrapper[4991]: Trace[131501526]: ---"Objects listed" error: 14891ms (01:02:26.072)
Dec 06 01:02:26 crc kubenswrapper[4991]: Trace[131501526]: [14.891843102s] [14.891843102s] END
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.073191 4991 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.074449 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.081148 4991 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.095593 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.112614 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.129099 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.155133 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.173789 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.173869 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.173922 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174018 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174071 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174100 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174151 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174176 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174201 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174251 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174274 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174318 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174341 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174383 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174407 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174426 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174443 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174510 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174563 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174591 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174713 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174737 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174760 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174807 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174883 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174908 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174958 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175019 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175042 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175063 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175110 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175135 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175181 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175212 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175257 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175278 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175303 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175347 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175368 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 
01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175412 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175434 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175456 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175534 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175556 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175604 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175625 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175647 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175692 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175768 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175795 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " 
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175844 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175926 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175956 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176024 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176051 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176245 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176271 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176322 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176349 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176409 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176433 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 01:02:26 crc 
kubenswrapper[4991]: I1206 01:02:26.176454 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176497 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176522 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176574 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176597 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176619 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176692 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176717 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176765 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176830 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176857 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176880 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176927 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176961 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178089 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178502 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178546 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178574 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178602 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178630 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178657 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178681 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178708 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178737 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178760 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178783 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178807 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178855 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178906 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178950 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179019 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179064 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179105 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc 
kubenswrapper[4991]: I1206 01:02:26.179148 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179198 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179323 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179378 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179443 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179485 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179525 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179568 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179618 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179662 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179705 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 
01:02:26.179749 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179793 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179834 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.179872 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180005 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180034 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180063 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180110 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180140 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180164 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180190 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180217 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180239 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180262 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180288 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180313 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180341 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180366 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180391 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180417 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180444 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180467 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" 
(UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180491 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180514 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180538 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180560 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180585 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180608 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180630 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180653 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180676 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180700 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180724 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180774 
4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180808 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180843 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180877 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180908 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180932 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180956 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.180980 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181028 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181059 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181092 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 
01:02:26.181125 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181152 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181176 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181202 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181236 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181263 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181286 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181313 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181337 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181361 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181386 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181410 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181436 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181460 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181491 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181528 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181554 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 06 01:02:26 crc 
kubenswrapper[4991]: I1206 01:02:26.181577 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181602 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181629 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181658 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181683 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181705 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181730 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181753 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181779 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181805 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181828 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 
01:02:26.181856 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181882 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181906 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181931 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181954 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.181978 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182025 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182050 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174767 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182080 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182106 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174785 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182130 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182159 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182185 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182216 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182254 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182326 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182353 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182378 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182406 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182469 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182747 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182810 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182912 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182963 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183047 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183090 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183161 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183194 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183221 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183254 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183285 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183312 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183400 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183418 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183436 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc 
kubenswrapper[4991]: I1206 01:02:26.206784 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.207682 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.207872 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.209502 4991 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.174875 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175086 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175115 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175299 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175432 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175516 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175735 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175766 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175777 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175758 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.175800 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176058 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176057 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176101 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176124 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176140 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176274 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176296 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176387 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176398 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176488 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176510 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176552 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176576 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176647 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176661 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176710 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176760 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176855 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176913 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176946 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.176970 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.177693 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.177851 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178394 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178644 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.178839 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182218 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182390 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182448 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182793 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182810 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.182812 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183581 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.183603 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.184440 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.184469 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.185285 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:02:26.685260635 +0000 UTC m=+20.035551103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.185661 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.185747 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.185745 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.185876 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.186278 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.186388 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.186595 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.186771 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.186799 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187049 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187073 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187113 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187155 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187199 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187324 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187382 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187548 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.187582 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188170 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188195 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188254 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188345 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188493 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188711 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.188788 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.192734 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.193251 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.194188 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.194734 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.195089 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.195424 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.195629 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.195627 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.195894 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.196150 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.196173 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.196347 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.196419 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.196671 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.196869 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.197229 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.197502 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.198123 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.198234 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.198657 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.198728 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.199127 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.199886 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.200558 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.200639 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.200876 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.201218 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.201298 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.218653 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.218687 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.218703 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.218782 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:26.718758717 +0000 UTC m=+20.069049185 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.201374 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.201684 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.201754 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.202105 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.202134 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.202189 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.202327 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.202429 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.203369 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.221052 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.221126 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:26.721099682 +0000 UTC m=+20.071390150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.204604 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.204973 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.205424 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.206289 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.221208 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:26.721200804 +0000 UTC m=+20.071491272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.209273 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.211873 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.211950 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.221232 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.212187 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.212271 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.212661 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.213067 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.213268 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.213271 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.213481 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.213596 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.213888 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.214435 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.220320 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.220808 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.221255 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.221538 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.221703 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.221835 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.223077 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.224214 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.224865 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.225263 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.225438 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.225874 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.226227 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.226399 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.227453 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.227582 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.227956 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.228041 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.228522 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.228534 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.228743 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.229005 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.229194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.229297 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.229406 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.229788 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.229927 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.230147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.230900 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.231096 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.231393 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.231560 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.231759 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.233688 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.234748 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.234822 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.237097 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.237167 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.237513 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.237547 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.237682 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.237831 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.238106 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.243338 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.243473 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.243584 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.244608 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:26.744574938 +0000 UTC m=+20.094865406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.243479 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.243604 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.243509 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.246615 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.246978 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.247061 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.247456 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.249083 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.248778 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.251374 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.255797 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.256151 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.256330 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.256550 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.256614 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.256951 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.257069 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.257330 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.257368 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.257415 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.257492 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.257818 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.259622 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.260393 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.265971 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.280973 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.284570 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.284636 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.284687 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.284851 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.284981 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285138 4991 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285218 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285239 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285263 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285286 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285304 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285324 4991 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285344 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285365 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285383 4991 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285400 4991 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 
06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285421 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285443 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285462 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285480 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285498 4991 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285518 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285537 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 
01:02:26.285555 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285573 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285591 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285609 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285623 4991 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285636 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285648 4991 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285661 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285675 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285689 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285710 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285723 4991 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285736 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285753 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285767 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285781 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285797 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285810 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285822 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285837 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285850 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285863 4991 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285877 4991 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285891 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285904 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285918 4991 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285930 4991 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285944 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285957 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on 
node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.285971 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286004 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286018 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286031 4991 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286046 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286060 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286076 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 
01:02:26.286089 4991 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286103 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286117 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286130 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286143 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286154 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286169 4991 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286184 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" 
(UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286197 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286213 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286231 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286247 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286266 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286283 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286299 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: 
I1206 01:02:26.286317 4991 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286336 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286356 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286374 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286394 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286413 4991 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286431 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286448 4991 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286465 4991 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286480 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286492 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286505 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286518 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286531 4991 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286544 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on 
node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286558 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286575 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286593 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286609 4991 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286625 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286637 4991 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286649 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286662 
4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286679 4991 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286697 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286714 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286732 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286749 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286762 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286774 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286787 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286801 4991 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286815 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286833 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286852 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286870 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286887 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" 
Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286903 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286915 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.286928 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287022 4991 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287037 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287053 4991 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287070 4991 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287089 4991 
reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287107 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287124 4991 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287142 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287159 4991 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287176 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287193 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287210 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287227 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287243 4991 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287256 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287270 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287283 4991 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287297 4991 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287311 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287326 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287340 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287353 4991 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287365 4991 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287378 4991 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287391 4991 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287404 4991 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287418 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287431 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287444 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287478 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287491 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287503 4991 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287518 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" 
DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287534 4991 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287546 4991 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287560 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287578 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287595 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287610 4991 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287624 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287637 4991 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287651 4991 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287664 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287678 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287696 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287714 4991 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287731 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287749 4991 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287767 4991 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287784 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287802 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287821 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287838 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287855 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287873 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") 
on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287890 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287908 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287924 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287943 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287961 4991 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.287980 4991 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288029 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" 
DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288042 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288057 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288071 4991 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288084 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288096 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288107 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288118 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288129 4991 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288142 4991 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288154 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288167 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288179 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288191 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288205 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288217 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288230 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288241 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288253 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.288265 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.303404 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.315650 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.324745 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.389403 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.618535 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.619478 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.619556 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.625603 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.633140 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.634447 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.648206 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.661502 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.672406 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.682958 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.691723 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.691951 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:02:27.691916468 +0000 UTC m=+21.042206936 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.695658 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.709571 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.726029 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.739135 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.755500 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.772566 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"message\\\":\\\"W1206 01:02:09.776859 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1206 01:02:09.777333 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764982929 cert, and key in /tmp/serving-cert-4060281674/serving-signer.crt, /tmp/serving-cert-4060281674/serving-signer.key\\\\nI1206 01:02:09.960996 1 observer_polling.go:159] Starting file observer\\\\nW1206 01:02:09.963838 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1206 01:02:09.963998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 01:02:09.964732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4060281674/tls.crt::/tmp/serving-cert-4060281674/tls.key\\\\\\\"\\\\nF1206 01:02:10.254246 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.786426 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.792311 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.792378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.792407 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.792430 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792552 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792574 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792586 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792634 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:27.792618509 +0000 UTC m=+21.142908977 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792858 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792928 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792902 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.792943 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.793133 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.793009 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:27.792993319 +0000 UTC m=+21.143283787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.793314 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:27.793300708 +0000 UTC m=+21.143591176 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:26 crc kubenswrapper[4991]: E1206 01:02:26.793392 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:27.79338136 +0000 UTC m=+21.143671828 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:26 crc kubenswrapper[4991]: I1206 01:02:26.796251 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.045213 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.045775 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.052304 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.053332 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.054025 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.054637 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.055262 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.055287 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"message\\\":\\\"W1206 01:02:09.776859 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1206 01:02:09.777333 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764982929 cert, and key in /tmp/serving-cert-4060281674/serving-signer.crt, /tmp/serving-cert-4060281674/serving-signer.key\\\\nI1206 01:02:09.960996 1 observer_polling.go:159] Starting file observer\\\\nW1206 01:02:09.963838 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1206 01:02:09.963998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 01:02:09.964732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4060281674/tls.crt::/tmp/serving-cert-4060281674/tls.key\\\\\\\"\\\\nF1206 01:02:10.254246 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.057215 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.057929 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.058695 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.059220 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.060081 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.060572 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.061099 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.061616 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.062161 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.062704 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.063108 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.063651 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.064266 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.064777 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.065339 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.065767 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.066559 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.067946 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.069011 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.070406 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.071151 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.071907 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.072538 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.073193 4991 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.073338 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.075286 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.076100 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.077367 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.078498 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.081234 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.082732 
4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.083365 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.084960 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.086254 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.087428 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.088527 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.090739 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.091022 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.091499 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.092086 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.092674 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.093245 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 
01:02:27.094143 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.094695 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.095210 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.095755 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.096334 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.096950 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.097606 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.107368 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.116468 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.120872 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.125831 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.127939 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.141623 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.151466 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.164458 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"message\\\":\\\"W1206 01:02:09.776859 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1206 01:02:09.777333 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764982929 cert, and key in /tmp/serving-cert-4060281674/serving-signer.crt, /tmp/serving-cert-4060281674/serving-signer.key\\\\nI1206 01:02:09.960996 1 observer_polling.go:159] Starting file observer\\\\nW1206 01:02:09.963838 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1206 01:02:09.963998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 01:02:09.964732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4060281674/tls.crt::/tmp/serving-cert-4060281674/tls.key\\\\\\\"\\\\nF1206 01:02:10.254246 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.173274 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.173850 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.176013 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f" exitCode=255 Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.176060 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.176209 4991 scope.go:117] "RemoveContainer" containerID="edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.177296 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b961949e2ac3f678dd43c24e1bdf59988fa7ff83738af6ff292c1afddef8b7cc"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.177363 4991 scope.go:117] "RemoveContainer" containerID="d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f" Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.178504 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.180024 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.180063 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.180102 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"439e38427e6c406799476c905308119ec400ac5b28650958d3161c0e30e0a831"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.181849 4991 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.182902 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.182934 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0c6da11331bf09596c1e1af16dbbff7c4a0c33a70c8b8ef3e3c9a34409b3f8de"} Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.201312 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.216251 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.233832 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.252289 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.266485 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.283391 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.307754 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://edd6bf5a4df067dc0f7401221503f61b7c09304f9de9f9e4c956b15761aadd18\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"message\\\":\\\"W1206 01:02:09.776859 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1206 01:02:09.777333 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764982929 cert, and key in /tmp/serving-cert-4060281674/serving-signer.crt, /tmp/serving-cert-4060281674/serving-signer.key\\\\nI1206 01:02:09.960996 1 observer_polling.go:159] Starting file observer\\\\nW1206 01:02:09.963838 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1206 01:02:09.963998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1206 01:02:09.964732 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4060281674/tls.crt::/tmp/serving-cert-4060281674/tls.key\\\\\\\"\\\\nF1206 01:02:10.254246 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" 
[serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.327094 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.342093 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.357109 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.371400 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.387428 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.400873 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.413347 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.699647 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.699879 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:02:29.699835623 +0000 UTC m=+23.050126101 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.800754 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.800806 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.800825 
4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:27 crc kubenswrapper[4991]: I1206 01:02:27.800845 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.800922 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.800966 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801007 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801020 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.800979 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:29.800964615 +0000 UTC m=+23.151255083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801059 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:29.801050148 +0000 UTC m=+23.151340616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801065 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801147 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801196 4991 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801201 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:29.801171331 +0000 UTC m=+23.151461809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801215 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:27 crc kubenswrapper[4991]: E1206 01:02:27.801311 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:29.801278944 +0000 UTC m=+23.151569612 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.037741 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.037789 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.037769 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:28 crc kubenswrapper[4991]: E1206 01:02:28.037890 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:28 crc kubenswrapper[4991]: E1206 01:02:28.038004 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:28 crc kubenswrapper[4991]: E1206 01:02:28.038051 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.187869 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.191377 4991 scope.go:117] "RemoveContainer" containerID="d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f" Dec 06 01:02:28 crc kubenswrapper[4991]: E1206 01:02:28.191525 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.207176 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.221191 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.234121 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.247414 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.260970 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.277570 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.291939 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:28 crc kubenswrapper[4991]: I1206 01:02:28.308639 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:28Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:29 crc kubenswrapper[4991]: I1206 01:02:29.719125 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.719512 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:02:33.719466939 +0000 UTC m=+27.069757407 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:29 crc kubenswrapper[4991]: I1206 01:02:29.820615 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:29 crc kubenswrapper[4991]: I1206 01:02:29.820666 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:29 crc kubenswrapper[4991]: I1206 01:02:29.820688 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:29 crc kubenswrapper[4991]: I1206 01:02:29.820713 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.820801 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.820847 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.820927 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821012 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821030 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.820866 4991 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821130 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821144 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.820876 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:33.820857289 +0000 UTC m=+27.171147757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821247 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:33.821210379 +0000 UTC m=+27.171500987 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821289 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:33.82126881 +0000 UTC m=+27.171559488 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:29 crc kubenswrapper[4991]: E1206 01:02:29.821318 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:33.821309291 +0000 UTC m=+27.171599969 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.037033 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.037132 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.037072 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:30 crc kubenswrapper[4991]: E1206 01:02:30.037215 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:30 crc kubenswrapper[4991]: E1206 01:02:30.037295 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:30 crc kubenswrapper[4991]: E1206 01:02:30.037356 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.197935 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879"} Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.213359 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.224813 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.238804 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.254103 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.267121 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.281319 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.293201 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:30 crc kubenswrapper[4991]: I1206 01:02:30.306542 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:30Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:31 crc kubenswrapper[4991]: I1206 01:02:31.094224 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:31 crc kubenswrapper[4991]: I1206 01:02:31.094959 4991 scope.go:117] "RemoveContainer" containerID="d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f" Dec 06 01:02:31 crc kubenswrapper[4991]: E1206 01:02:31.095146 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.037744 4991 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.037766 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.037898 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.037876 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.038022 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.038197 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.059706 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-kw4lt"] Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.060103 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.062464 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.062597 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.065908 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.081484 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.099852 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.117683 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.130385 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.143360 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnwnr\" (UniqueName: \"kubernetes.io/projected/c790aea7-3412-47a6-a3d0-3dd019c26022-kube-api-access-hnwnr\") pod \"node-resolver-kw4lt\" (UID: 
\"c790aea7-3412-47a6-a3d0-3dd019c26022\") " pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.143440 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c790aea7-3412-47a6-a3d0-3dd019c26022-hosts-file\") pod \"node-resolver-kw4lt\" (UID: \"c790aea7-3412-47a6-a3d0-3dd019c26022\") " pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.146403 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.188356 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.207577 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.233589 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.243924 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnwnr\" (UniqueName: \"kubernetes.io/projected/c790aea7-3412-47a6-a3d0-3dd019c26022-kube-api-access-hnwnr\") pod \"node-resolver-kw4lt\" (UID: \"c790aea7-3412-47a6-a3d0-3dd019c26022\") " pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.243970 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c790aea7-3412-47a6-a3d0-3dd019c26022-hosts-file\") pod \"node-resolver-kw4lt\" (UID: \"c790aea7-3412-47a6-a3d0-3dd019c26022\") " pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.244149 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c790aea7-3412-47a6-a3d0-3dd019c26022-hosts-file\") pod \"node-resolver-kw4lt\" (UID: \"c790aea7-3412-47a6-a3d0-3dd019c26022\") " pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.265613 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnwnr\" (UniqueName: \"kubernetes.io/projected/c790aea7-3412-47a6-a3d0-3dd019c26022-kube-api-access-hnwnr\") pod \"node-resolver-kw4lt\" (UID: \"c790aea7-3412-47a6-a3d0-3dd019c26022\") " pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.281583 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.374143 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kw4lt" Dec 06 01:02:32 crc kubenswrapper[4991]: W1206 01:02:32.390257 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc790aea7_3412_47a6_a3d0_3dd019c26022.slice/crio-9b013423232a6d83c2695536e3c9bc5dac2f1b058fff52327bc5ac4702b43861 WatchSource:0}: Error finding container 9b013423232a6d83c2695536e3c9bc5dac2f1b058fff52327bc5ac4702b43861: Status 404 returned error can't find the container with id 9b013423232a6d83c2695536e3c9bc5dac2f1b058fff52327bc5ac4702b43861 Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.473251 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.475515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.475839 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.475853 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.475940 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.481654 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ppxx8"] Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.481974 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-sq7kg"] Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.482280 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.482411 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tw85g"] Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.482586 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.486015 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.488022 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.490241 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.490518 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.490546 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.490724 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.490739 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.490823 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.491051 4991 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.491438 4991 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.491904 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.491955 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.492197 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.492599 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.493386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.493416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.493428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.493448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc 
kubenswrapper[4991]: I1206 01:02:32.493514 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.493462 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.504957 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.513405 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.518397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.518448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.518457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.518477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.518493 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.521607 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.532245 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.533579 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.539609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.539840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.539903 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc 
kubenswrapper[4991]: I1206 01:02:32.539978 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.540067 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547290 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06f95c58-fd69-48f6-94be-15927aa850c2-cni-binary-copy\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547336 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-os-release\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547361 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-cni-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547468 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09e52ba3-560c-483d-9c51-898cbe94aad5-cni-binary-copy\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547516 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-cni-bin\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547541 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-socket-dir-parent\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547562 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f35cea72-9e31-4432-9e3b-16a50ebce794-proxy-tls\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547580 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-k8s-cni-cncf-io\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547601 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-cni-multus\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547630 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x44cm\" (UniqueName: \"kubernetes.io/projected/09e52ba3-560c-483d-9c51-898cbe94aad5-kube-api-access-x44cm\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547652 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-system-cni-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547673 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-cnibin\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547692 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-kubelet\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547743 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-hostroot\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547771 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f35cea72-9e31-4432-9e3b-16a50ebce794-rootfs\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547789 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09e52ba3-560c-483d-9c51-898cbe94aad5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547806 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-cnibin\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547827 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lq67\" (UniqueName: \"kubernetes.io/projected/f35cea72-9e31-4432-9e3b-16a50ebce794-kube-api-access-8lq67\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 
01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547846 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-system-cni-dir\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547865 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-os-release\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547885 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547908 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-etc-kubernetes\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.547941 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f35cea72-9e31-4432-9e3b-16a50ebce794-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " 
pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.548004 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-netns\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.548026 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-multus-certs\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.548049 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-conf-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.548067 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06f95c58-fd69-48f6-94be-15927aa850c2-multus-daemon-config\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.548091 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8lm\" (UniqueName: \"kubernetes.io/projected/06f95c58-fd69-48f6-94be-15927aa850c2-kube-api-access-lw8lm\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " 
pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.550162 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.552166 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.557508 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.557567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.557576 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.557596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.557612 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.563086 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.572536 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.576811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.576972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.577087 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.577211 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.577312 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.580686 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.594687 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: E1206 01:02:32.594839 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.596771 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.596878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.597043 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.597060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.597081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.597095 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.613505 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.628865 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.641813 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649022 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-kubelet\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649088 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x44cm\" (UniqueName: \"kubernetes.io/projected/09e52ba3-560c-483d-9c51-898cbe94aad5-kube-api-access-x44cm\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649108 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-system-cni-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649126 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-cnibin\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649165 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-hostroot\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649194 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f35cea72-9e31-4432-9e3b-16a50ebce794-rootfs\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649213 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-cnibin\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649262 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09e52ba3-560c-483d-9c51-898cbe94aad5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649281 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lq67\" (UniqueName: 
\"kubernetes.io/projected/f35cea72-9e31-4432-9e3b-16a50ebce794-kube-api-access-8lq67\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649302 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-system-cni-dir\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649319 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-os-release\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649343 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649361 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-etc-kubernetes\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649384 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f35cea72-9e31-4432-9e3b-16a50ebce794-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649409 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-netns\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649430 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-multus-certs\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649451 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-conf-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649477 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06f95c58-fd69-48f6-94be-15927aa850c2-multus-daemon-config\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649495 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8lm\" (UniqueName: \"kubernetes.io/projected/06f95c58-fd69-48f6-94be-15927aa850c2-kube-api-access-lw8lm\") 
pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649514 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06f95c58-fd69-48f6-94be-15927aa850c2-cni-binary-copy\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649538 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-os-release\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649555 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-cni-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649553 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-system-cni-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649599 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09e52ba3-560c-483d-9c51-898cbe94aad5-cni-binary-copy\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " 
pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649753 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-socket-dir-parent\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649800 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-cni-bin\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650041 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650082 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-cnibin\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650161 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-hostroot\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650218 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f35cea72-9e31-4432-9e3b-16a50ebce794-rootfs\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650272 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-cnibin\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.649203 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-kubelet\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650338 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-socket-dir-parent\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650365 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09e52ba3-560c-483d-9c51-898cbe94aad5-cni-binary-copy\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650413 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-conf-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650469 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-etc-kubernetes\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.650957 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06f95c58-fd69-48f6-94be-15927aa850c2-multus-daemon-config\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651028 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-system-cni-dir\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651115 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-os-release\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651157 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09e52ba3-560c-483d-9c51-898cbe94aad5-os-release\") pod \"multus-additional-cni-plugins-tw85g\" (UID: 
\"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651320 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f35cea72-9e31-4432-9e3b-16a50ebce794-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651379 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-netns\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651421 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-multus-certs\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651442 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-multus-cni-dir\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651457 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-cni-bin\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc 
kubenswrapper[4991]: I1206 01:02:32.651635 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09e52ba3-560c-483d-9c51-898cbe94aad5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651748 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f35cea72-9e31-4432-9e3b-16a50ebce794-proxy-tls\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651796 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-k8s-cni-cncf-io\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651832 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-cni-multus\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651839 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06f95c58-fd69-48f6-94be-15927aa850c2-cni-binary-copy\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651928 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-var-lib-cni-multus\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.651929 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06f95c58-fd69-48f6-94be-15927aa850c2-host-run-k8s-cni-cncf-io\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.655184 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f35cea72-9e31-4432-9e3b-16a50ebce794-proxy-tls\") pod \"machine-config-daemon-ppxx8\" (UID: \"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.659162 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.671708 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.672184 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x44cm\" (UniqueName: \"kubernetes.io/projected/09e52ba3-560c-483d-9c51-898cbe94aad5-kube-api-access-x44cm\") pod \"multus-additional-cni-plugins-tw85g\" (UID: \"09e52ba3-560c-483d-9c51-898cbe94aad5\") " pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.672424 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8lm\" (UniqueName: \"kubernetes.io/projected/06f95c58-fd69-48f6-94be-15927aa850c2-kube-api-access-lw8lm\") pod \"multus-sq7kg\" (UID: \"06f95c58-fd69-48f6-94be-15927aa850c2\") " pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.674240 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lq67\" (UniqueName: \"kubernetes.io/projected/f35cea72-9e31-4432-9e3b-16a50ebce794-kube-api-access-8lq67\") pod \"machine-config-daemon-ppxx8\" (UID: 
\"f35cea72-9e31-4432-9e3b-16a50ebce794\") " pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.684927 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.698628 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.699698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.699754 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.699767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc 
kubenswrapper[4991]: I1206 01:02:32.699790 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.699804 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.711420 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.730817 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.745882 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.757762 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.769979 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.786216 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.802368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.802413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.802425 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.802442 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.802453 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.806665 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sq7kg" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.820006 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:02:32 crc kubenswrapper[4991]: W1206 01:02:32.820546 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06f95c58_fd69_48f6_94be_15927aa850c2.slice/crio-b56d6d71a85e1768a19d27cde9a5ed0e40bfa3e0ab2d9cc0a8b719c143a90dd9 WatchSource:0}: Error finding container b56d6d71a85e1768a19d27cde9a5ed0e40bfa3e0ab2d9cc0a8b719c143a90dd9: Status 404 returned error can't find the container with id b56d6d71a85e1768a19d27cde9a5ed0e40bfa3e0ab2d9cc0a8b719c143a90dd9 Dec 06 01:02:32 crc kubenswrapper[4991]: W1206 01:02:32.836025 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35cea72_9e31_4432_9e3b_16a50ebce794.slice/crio-8af7825bbf5e08945210bc1fb9de5d5e703c4a38c8c8a3d8d974993568cbbb90 WatchSource:0}: Error finding container 8af7825bbf5e08945210bc1fb9de5d5e703c4a38c8c8a3d8d974993568cbbb90: Status 404 returned error can't find the container with id 8af7825bbf5e08945210bc1fb9de5d5e703c4a38c8c8a3d8d974993568cbbb90 Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.839032 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.850797 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tw85g" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.862122 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.867747 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5t29r"] Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.869905 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.877803 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.877908 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.880318 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.881322 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.881379 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.881749 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.886184 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.886449 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.913855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.914069 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.915673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.915766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.915842 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:32Z","lastTransitionTime":"2025-12-06T01:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.917064 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.941465 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956257 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-systemd-units\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956305 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-ovn\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956338 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-env-overrides\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956392 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-log-socket\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956415 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-bin\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956474 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-netd\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956512 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-kubelet\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956545 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-slash\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956561 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956578 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-node-log\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956609 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956631 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-etc-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956650 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-var-lib-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956667 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-netns\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956698 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-systemd\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956729 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-config\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956748 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-script-lib\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956764 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pnfh\" (UniqueName: \"kubernetes.io/projected/7a1c8508-3f35-44b2-988e-cc665121ce94-kube-api-access-5pnfh\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.956780 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-ovn-kubernetes\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 
crc kubenswrapper[4991]: I1206 01:02:32.956796 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a1c8508-3f35-44b2-988e-cc665121ce94-ovn-node-metrics-cert\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.965937 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.982152 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:32 crc kubenswrapper[4991]: I1206 01:02:32.995556 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:32Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.006594 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.019649 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.019679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.019689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.019706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.019717 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.029454 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d
2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.048710 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058591 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-systemd\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058632 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-config\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058651 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-script-lib\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058667 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pnfh\" (UniqueName: \"kubernetes.io/projected/7a1c8508-3f35-44b2-988e-cc665121ce94-kube-api-access-5pnfh\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058683 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-ovn-kubernetes\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058709 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a1c8508-3f35-44b2-988e-cc665121ce94-ovn-node-metrics-cert\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058694 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-systemd\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058731 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-ovn\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058751 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-systemd-units\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058769 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-env-overrides\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058806 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-log-socket\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058822 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-bin\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058872 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-netd\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058839 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-netd\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058919 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-ovn\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058945 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-systemd-units\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.058948 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-kubelet\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 
06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059028 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-bin\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059037 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-slash\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059059 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-kubelet\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059062 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059090 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-slash\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059121 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059109 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-log-socket\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059123 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-node-log\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-node-log\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059221 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059248 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059255 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-var-lib-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059296 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-etc-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059339 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-netns\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059359 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-var-lib-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059362 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-etc-openvswitch\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059435 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-netns\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.059610 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-env-overrides\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.060177 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-script-lib\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.060322 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-config\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.064220 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.066565 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a1c8508-3f35-44b2-988e-cc665121ce94-ovn-node-metrics-cert\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.075319 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pnfh\" (UniqueName: \"kubernetes.io/projected/7a1c8508-3f35-44b2-988e-cc665121ce94-kube-api-access-5pnfh\") pod \"ovnkube-node-5t29r\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.078708 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.096322 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.112814 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.122356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.122407 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.122422 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.122444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.122458 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.125233 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.210675 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.216022 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerStarted","Data":"8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.216101 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerStarted","Data":"f9269d4130fa2739e47df7d4a33e8993f0e8e4478778f0c992b94aa1498779ee"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.221341 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.221417 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.221434 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"8af7825bbf5e08945210bc1fb9de5d5e703c4a38c8c8a3d8d974993568cbbb90"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.224592 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.224645 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.224657 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.224680 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.224692 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.225314 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerStarted","Data":"d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.225366 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerStarted","Data":"b56d6d71a85e1768a19d27cde9a5ed0e40bfa3e0ab2d9cc0a8b719c143a90dd9"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.228006 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kw4lt" event={"ID":"c790aea7-3412-47a6-a3d0-3dd019c26022","Type":"ContainerStarted","Data":"ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.228069 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-kw4lt" event={"ID":"c790aea7-3412-47a6-a3d0-3dd019c26022","Type":"ContainerStarted","Data":"9b013423232a6d83c2695536e3c9bc5dac2f1b058fff52327bc5ac4702b43861"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.228777 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: W1206 01:02:33.232868 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1c8508_3f35_44b2_988e_cc665121ce94.slice/crio-a3bd3088dfcb2b7a46a300624e3817052a809cdabf3674dfcfad4d82c9659ebe WatchSource:0}: Error finding container a3bd3088dfcb2b7a46a300624e3817052a809cdabf3674dfcfad4d82c9659ebe: Status 404 returned error can't find the container with id a3bd3088dfcb2b7a46a300624e3817052a809cdabf3674dfcfad4d82c9659ebe Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 
01:02:33.244916 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.269677 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.284318 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.299830 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.314232 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.326966 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.327838 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.327874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.327888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.327907 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.327918 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.343002 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.358390 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.375140 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.389700 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.404562 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.418299 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.430981 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.431110 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.431135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.431168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.431190 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.433480 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.450749 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.468420 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.480608 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.497315 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.515556 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.528649 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.536899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.536976 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.537099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.537128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.537142 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.544611 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.559383 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.577819 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.589400 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.603924 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.623474 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:33Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.639637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.639685 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.639694 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.639710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.639719 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.742288 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.742613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.742630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.742654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.742670 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.768249 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.768592 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 01:02:41.768571111 +0000 UTC m=+35.118861589 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.845753 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.845802 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.845814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.845835 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.845849 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.870096 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.870399 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.870444 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.870461 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.870533 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:41.870507316 +0000 UTC m=+35.220797784 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.870604 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.870429 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.870703 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:41.870679691 +0000 UTC m=+35.220970169 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.870910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.871172 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.871131 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.871457 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.871544 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.871658 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:41.871649328 +0000 UTC m=+35.221939796 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.871337 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: E1206 01:02:33.871842 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:41.871834633 +0000 UTC m=+35.222125101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.948499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.948792 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.948860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.948928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:33 crc kubenswrapper[4991]: I1206 01:02:33.949028 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:33Z","lastTransitionTime":"2025-12-06T01:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.037626 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.037651 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:34 crc kubenswrapper[4991]: E1206 01:02:34.037818 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:34 crc kubenswrapper[4991]: E1206 01:02:34.038018 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.038258 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:34 crc kubenswrapper[4991]: E1206 01:02:34.038458 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.051695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.051862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.052134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.052234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.052315 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.156234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.156283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.156295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.156314 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.156328 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.233321 4991 generic.go:334] "Generic (PLEG): container finished" podID="09e52ba3-560c-483d-9c51-898cbe94aad5" containerID="8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43" exitCode=0 Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.233415 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerDied","Data":"8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.236217 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" exitCode=0 Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.236287 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.236329 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"a3bd3088dfcb2b7a46a300624e3817052a809cdabf3674dfcfad4d82c9659ebe"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.253109 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.260216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.260301 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.260318 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.260345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.260359 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.273066 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.287108 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.298656 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.319398 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.339621 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.353732 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.365448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.365489 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.365502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.365522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.365537 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.372028 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.396508 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.415970 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.431044 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.444591 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.469350 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.469826 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.469840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.469864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.469879 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.471552 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.488065 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.513201 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.563601 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.576620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.576669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.576678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.576700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.576711 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.593663 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.616465 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.629571 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.656015 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.669620 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.679435 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc 
kubenswrapper[4991]: I1206 01:02:34.679485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.679498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.679519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.679533 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.688287 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.701313 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.717757 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.732524 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.752199 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.790473 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.790545 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.790561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.790585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.790601 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.893674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.893734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.893758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.893784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.893798 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.997510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.997582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.997597 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.997619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:34 crc kubenswrapper[4991]: I1206 01:02:34.997633 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:34Z","lastTransitionTime":"2025-12-06T01:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.100675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.100738 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.100755 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.100781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.100798 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.138301 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-g2v2x"] Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.138762 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.141395 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.142126 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.143050 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.143524 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.154538 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.172382 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.182403 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.185494 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlkqk\" (UniqueName: \"kubernetes.io/projected/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-kube-api-access-jlkqk\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.185543 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-host\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.185572 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-serviceca\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.198771 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.203562 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.203602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.203613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 
01:02:35.203631 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.203643 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.211745 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.222085 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.236815 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.242433 4991 generic.go:334] "Generic (PLEG): container finished" podID="09e52ba3-560c-483d-9c51-898cbe94aad5" containerID="5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb" exitCode=0 Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.242541 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" 
event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerDied","Data":"5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.246400 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.246432 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.246446 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.246459 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.246473 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.246486 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" 
event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.253518 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.272374 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.288245 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlkqk\" (UniqueName: \"kubernetes.io/projected/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-kube-api-access-jlkqk\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.288330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-host\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.288375 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-serviceca\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.288812 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-host\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.288905 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.289706 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-serviceca\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.305045 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.309608 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlkqk\" (UniqueName: 
\"kubernetes.io/projected/31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37-kube-api-access-jlkqk\") pod \"node-ca-g2v2x\" (UID: \"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\") " pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.310434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.310475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.310489 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.310512 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.310530 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.322025 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.335601 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.351280 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.362810 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.377694 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.404245 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.414179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.414219 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.414230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.414249 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.414260 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.416129 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.432393 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.446821 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.451176 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g2v2x" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.471902 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: W1206 01:02:35.472241 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c69985_6af9_4dc7_9e6f_9cb4c8ad1b37.slice/crio-8aef2200c8fade8026cc44e4f2b4a12f5fca864b527d8aade05f0bffdab78b63 WatchSource:0}: Error finding container 8aef2200c8fade8026cc44e4f2b4a12f5fca864b527d8aade05f0bffdab78b63: Status 404 returned error can't find the container with id 8aef2200c8fade8026cc44e4f2b4a12f5fca864b527d8aade05f0bffdab78b63 Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.516037 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.518671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.518705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.518718 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc 
kubenswrapper[4991]: I1206 01:02:35.518741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.518755 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.552029 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.593799 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.622828 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc 
kubenswrapper[4991]: I1206 01:02:35.622872 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.622885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.622908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.622923 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.636547 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.670548 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.711684 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.726214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.726282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.726299 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.726333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.726351 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.751385 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:35Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.828474 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.828503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.828513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.828530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.828539 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.931187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.931240 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.931260 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.931283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:35 crc kubenswrapper[4991]: I1206 01:02:35.931297 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:35Z","lastTransitionTime":"2025-12-06T01:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.034516 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.034570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.034580 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.034601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.034616 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.036769 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.036818 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:36 crc kubenswrapper[4991]: E1206 01:02:36.036894 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:36 crc kubenswrapper[4991]: E1206 01:02:36.036971 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.037054 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:36 crc kubenswrapper[4991]: E1206 01:02:36.037126 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.138115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.138177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.138190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.138213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.138228 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.242082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.242127 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.242138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.242157 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.242172 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.251158 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g2v2x" event={"ID":"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37","Type":"ContainerStarted","Data":"ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.251225 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g2v2x" event={"ID":"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37","Type":"ContainerStarted","Data":"8aef2200c8fade8026cc44e4f2b4a12f5fca864b527d8aade05f0bffdab78b63"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.253641 4991 generic.go:334] "Generic (PLEG): container finished" podID="09e52ba3-560c-483d-9c51-898cbe94aad5" containerID="e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78" exitCode=0 Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.253676 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerDied","Data":"e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.267085 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.285814 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.298491 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.312125 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.325759 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.338974 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.345254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.345303 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.345319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.345341 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.345356 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.353598 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.368558 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.387401 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.402824 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.415199 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.427006 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.448483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.448530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.448540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.448575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.448586 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.448458 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.461613 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.476833 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.490322 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.532177 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.554605 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.554662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.554676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc 
kubenswrapper[4991]: I1206 01:02:36.554697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.554713 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.568443 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.584542 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.601233 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.615423 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.630963 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.657302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.657343 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.657352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.657369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.657379 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.674566 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:
02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.715204 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 
01:02:36.752008 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.760302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.760367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.760380 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.760402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.760420 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.796403 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.837749 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.864124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.864186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.864198 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.864218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.864233 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.875135 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:36Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.968175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.968231 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.968241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.968258 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:36 crc kubenswrapper[4991]: I1206 01:02:36.968269 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:36Z","lastTransitionTime":"2025-12-06T01:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.087827 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.087906 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.087928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.087960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.088026 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.105778 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.123687 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.139579 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.155743 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.171487 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.186461 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.190929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.191190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.191336 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.191480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.191616 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.201312 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.216022 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.240302 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 
01:02:37.258323 4991 generic.go:334] "Generic (PLEG): container finished" podID="09e52ba3-560c-483d-9c51-898cbe94aad5" containerID="f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604" exitCode=0 Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.258385 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerDied","Data":"f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.262841 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.275090 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.294721 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.294771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.294786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.294808 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.294823 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.309868 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.357072 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.391512 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.399683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.399741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.399755 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.399781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.399795 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.433572 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.483660 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.502264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.502331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.502345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.502370 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.502385 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.512841 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.555618 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.593772 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.605382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.605444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.605460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.605484 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.605499 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.633489 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.675377 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.708103 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.708160 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.708176 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.708203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.708222 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.719783 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.756119 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.792044 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.810792 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.810830 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.810841 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.810859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.810871 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.843845 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.877065 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.913863 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:37 crc 
kubenswrapper[4991]: I1206 01:02:37.913910 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.913922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.913940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.913956 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:37Z","lastTransitionTime":"2025-12-06T01:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.916421 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.951713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:37 crc kubenswrapper[4991]: I1206 01:02:37.991568 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.017037 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.017086 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.017104 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.017130 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.017148 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.036751 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:38 crc kubenswrapper[4991]: E1206 01:02:38.036938 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.037545 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:38 crc kubenswrapper[4991]: E1206 01:02:38.037658 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.037749 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:38 crc kubenswrapper[4991]: E1206 01:02:38.037833 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.121710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.121788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.121812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.121844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.121876 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.225451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.225543 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.225569 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.225607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.225635 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.271697 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerStarted","Data":"e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.294458 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb
22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.317417 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.329596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.329651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.329663 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.329684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.329700 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.336821 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.356354 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.376417 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.396819 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44
cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.414368 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.432689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.432757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.432777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.432872 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 
01:02:38.432896 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.436538 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.469317 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.483629 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.506231 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.525845 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.536065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.536126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.536147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 
01:02:38.536175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.536192 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.546685 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.565167 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:38Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.639635 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.639693 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.639706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc 
kubenswrapper[4991]: I1206 01:02:38.639725 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.639735 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.742467 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.742535 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.742550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.742594 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.742612 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.845188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.845250 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.845266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.845287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.845300 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.948854 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.948922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.948935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.948959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:38 crc kubenswrapper[4991]: I1206 01:02:38.948972 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:38Z","lastTransitionTime":"2025-12-06T01:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.051607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.051681 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.051699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.051723 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.051741 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.155352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.155434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.155452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.155480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.155499 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.258455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.258533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.258552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.258582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.258601 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.280839 4991 generic.go:334] "Generic (PLEG): container finished" podID="09e52ba3-560c-483d-9c51-898cbe94aad5" containerID="e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef" exitCode=0 Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.280893 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerDied","Data":"e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.317437 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.332255 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.346882 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.359215 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.366359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.366417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.366434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc 
kubenswrapper[4991]: I1206 01:02:39.366460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.366483 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.375899 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.391129 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.409421 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.422937 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.436882 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.449229 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.461540 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.468975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.469031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.469041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.469058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.469067 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.479546 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.498616 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.510808 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:39Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.571858 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.571896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.571904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 
01:02:39.571920 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.571931 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.674635 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.674681 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.674691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.674711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.674723 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.778318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.778400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.778414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.778437 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.778474 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.880882 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.880937 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.880947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.880962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.880995 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.984654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.984756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.984781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.984821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:39 crc kubenswrapper[4991]: I1206 01:02:39.984846 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:39Z","lastTransitionTime":"2025-12-06T01:02:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.037623 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.037734 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.037620 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:40 crc kubenswrapper[4991]: E1206 01:02:40.037852 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:40 crc kubenswrapper[4991]: E1206 01:02:40.038095 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:40 crc kubenswrapper[4991]: E1206 01:02:40.038253 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.088365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.088410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.088420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.088441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.088456 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.191119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.191180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.191197 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.191229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.191248 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.290332 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerStarted","Data":"8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.293418 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.293503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.293536 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.293651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.293687 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.300086 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.300501 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.300569 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.320545 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2
ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.370001 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.370473 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.376046 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.389308 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.396915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.397009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.397030 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 
01:02:40.397061 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.397086 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.412737 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.429189 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.448825 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.465434 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.484970 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.500553 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.500619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.500641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.500670 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.500693 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.502451 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e12
83910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.516648 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.531135 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.553763 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.578810 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.599160 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.603707 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.603769 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.603784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.603803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.603816 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.620864 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.644214 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.664040 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.687517 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.707567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.707640 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.707666 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.707706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.707723 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.709741 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" 
Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.727638 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.744604 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.759541 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.784821 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.804776 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.811938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.812016 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.812028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.812049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.812086 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.821470 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.840603 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1
f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.874618 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.891425 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:40Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.914510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.914582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.914603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 01:02:40.914636 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:40 crc kubenswrapper[4991]: I1206 
01:02:40.914655 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:40Z","lastTransitionTime":"2025-12-06T01:02:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.017430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.017480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.017492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.017515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.017532 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.120954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.121047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.121065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.121091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.121109 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.224064 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.224129 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.224147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.224175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.224193 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.306378 4991 generic.go:334] "Generic (PLEG): container finished" podID="09e52ba3-560c-483d-9c51-898cbe94aad5" containerID="8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c" exitCode=0 Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.306522 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.307302 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerDied","Data":"8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.324423 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bb
e0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.327248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.327327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.327346 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.327374 
4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.327390 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.344330 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.374200 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.388691 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711
d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.412747 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.429962 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.432586 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.432653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.432667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.432691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.432705 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.449578 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.465772 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.483962 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.500728 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.515728 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.530820 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.540519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.540552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.540565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.540587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.540633 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.547035 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:
02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.564434 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:41Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.643876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.643929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.643939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.643961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.643972 4991 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.747886 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.747956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.747973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.748022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.748043 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.768939 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.769178 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:02:57.769143926 +0000 UTC m=+51.119434404 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.851239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.851318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.851356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.851374 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: 
I1206 01:02:41.851384 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.955053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.955100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.955110 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.955127 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.955137 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:41Z","lastTransitionTime":"2025-12-06T01:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.971455 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.971525 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.971558 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:41 crc kubenswrapper[4991]: I1206 01:02:41.971603 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971769 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971820 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971837 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971846 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971911 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:57.971886085 +0000 UTC m=+51.322176573 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971938 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-06 01:02:57.971926766 +0000 UTC m=+51.322217244 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971795 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.971979 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:57.971972047 +0000 UTC m=+51.322262525 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.972019 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.972040 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.972059 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:41 crc kubenswrapper[4991]: E1206 01:02:41.972113 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:02:57.97208818 +0000 UTC m=+51.322378878 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.037052 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.037087 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.037207 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.037198 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.037277 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.037326 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.057791 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.057827 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.057843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.057865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.057881 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.160180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.160225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.160233 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.160249 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.160260 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.263456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.263514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.263529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.263547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.263560 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.314428 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" event={"ID":"09e52ba3-560c-483d-9c51-898cbe94aad5","Type":"ContainerStarted","Data":"3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.314533 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.336961 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.357129 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.366477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.366563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.366582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.366610 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.366624 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.381320 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.400383 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.428320 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.445356 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.465788 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.469613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.469669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.469684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.469718 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.469732 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.491702 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.514503 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.527748 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.539697 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.551642 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.562068 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.573604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.573659 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.573672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc 
kubenswrapper[4991]: I1206 01:02:42.573695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.573708 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.577437 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.676915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.676965 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.676975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.677019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.677033 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.780052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.780117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.780133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.780156 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.780169 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.791503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.791539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.791553 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.791572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.791592 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.813789 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.818921 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.819009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.819020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.819042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.819054 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.840505 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.846026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.846115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.846145 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.846179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.846203 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.865766 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.871105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.871142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.871154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.871173 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.871186 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.888426 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.893063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.893128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.893142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.893168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.893182 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.908465 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:42 crc kubenswrapper[4991]: E1206 01:02:42.908591 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.910552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.910596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.910609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.910628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:42 crc kubenswrapper[4991]: I1206 01:02:42.910642 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:42Z","lastTransitionTime":"2025-12-06T01:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.013226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.013269 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.013278 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.013295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.013304 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.116515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.116576 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.116595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.116622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.116638 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.219634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.219669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.219677 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.219694 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.219704 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.322810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.322874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.322892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.322918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.322935 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.425807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.425864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.425875 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.425900 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.425911 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.528905 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.528955 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.528964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.528997 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.529008 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.631417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.631463 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.631473 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.631490 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.631500 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.736532 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.736599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.736614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.736638 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.736652 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.839979 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.840154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.840179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.840205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.840231 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.943234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.943375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.943399 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.943430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:43 crc kubenswrapper[4991]: I1206 01:02:43.943451 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:43Z","lastTransitionTime":"2025-12-06T01:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.036938 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.037103 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.036953 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:44 crc kubenswrapper[4991]: E1206 01:02:44.037228 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:44 crc kubenswrapper[4991]: E1206 01:02:44.037658 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:44 crc kubenswrapper[4991]: E1206 01:02:44.037746 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.038519 4991 scope.go:117] "RemoveContainer" containerID="d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.048926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.049059 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.049082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.049115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.049137 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.151669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.151733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.151752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.151784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.151804 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.255373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.255424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.255436 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.255452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.255470 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.326213 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/0.log" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.330393 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2" exitCode=1 Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.330474 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.334055 4991 scope.go:117] "RemoveContainer" containerID="74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.352241 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.358080 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.358122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.358133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.358151 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.358163 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.373789 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.397869 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:43Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI1206 01:02:43.087189 6324 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:43.087487 6324 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:43.087832 6324 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 01:02:43.087954 6324 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:43.087964 6324 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:43.087980 6324 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 01:02:43.088008 6324 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 01:02:43.088024 6324 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 01:02:43.088083 6324 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 01:02:43.088112 6324 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01:02:43.088139 6324 factory.go:656] Stopping watch factory\\\\nI1206 01:02:43.088146 6324 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 01:02:43.088154 6324 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 01:02:43.088162 6324 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.416316 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.442798 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.461446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.461526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.461551 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.461582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.461606 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.465954 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.488660 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.508392 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.529124 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.549128 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.564686 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.564756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.564776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.564885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.564912 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.567836 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.591713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.613311 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.638758 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:44Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.668821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.668924 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.668948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.669010 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.669030 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.772548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.772624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.772643 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.772674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.772693 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.876307 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.876385 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.876405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.876434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.876455 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.980292 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.980337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.980346 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.980365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:44 crc kubenswrapper[4991]: I1206 01:02:44.980380 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:44Z","lastTransitionTime":"2025-12-06T01:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.083450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.083555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.083581 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.083612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.083631 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.191806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.191892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.191915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.191943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.191962 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.296779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.297456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.297485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.297527 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.297554 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.405281 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.405337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.405357 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.405384 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.405405 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.414947 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g"] Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.416025 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.418981 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.421326 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.436944 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445
c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.474195 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:43Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI1206 01:02:43.087189 6324 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:43.087487 6324 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:43.087832 6324 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 01:02:43.087954 6324 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:43.087964 6324 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:43.087980 6324 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 01:02:43.088008 6324 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 01:02:43.088024 6324 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 01:02:43.088083 6324 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 01:02:43.088112 6324 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01:02:43.088139 6324 factory.go:656] Stopping watch factory\\\\nI1206 01:02:43.088146 6324 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 01:02:43.088154 6324 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 01:02:43.088162 6324 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.493531 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.508631 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.508687 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.508705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.508734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.508753 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.514489 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gpqq\" (UniqueName: \"kubernetes.io/projected/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-kube-api-access-9gpqq\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.514609 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.514719 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.514775 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.518285 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.537915 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.557921 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.583646 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.606349 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.612053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.612126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.612142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.612169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.612183 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.615748 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.615811 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.615884 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gpqq\" (UniqueName: \"kubernetes.io/projected/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-kube-api-access-9gpqq\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.615928 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.617248 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.617491 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.629468 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.630116 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.651036 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gpqq\" (UniqueName: \"kubernetes.io/projected/dfebf788-e0ac-4bd4-8926-1f7b7cabba71-kube-api-access-9gpqq\") pod \"ovnkube-control-plane-749d76644c-t8t9g\" (UID: \"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.652060 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.674116 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.696899 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.715730 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc 
kubenswrapper[4991]: I1206 01:02:45.715810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.715835 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.715872 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.715896 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.730621 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.737782 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.747765 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.774440 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:45Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.818850 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.818893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.818902 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.818917 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.818927 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.921675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.921746 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.921770 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.921809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:45 crc kubenswrapper[4991]: I1206 01:02:45.921834 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:45Z","lastTransitionTime":"2025-12-06T01:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.025126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.025205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.025216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.025234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.025246 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.038744 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.038847 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:46 crc kubenswrapper[4991]: E1206 01:02:46.038945 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:46 crc kubenswrapper[4991]: E1206 01:02:46.039261 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.039333 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:46 crc kubenswrapper[4991]: E1206 01:02:46.039460 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.129524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.129601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.129625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.129658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.129682 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.233108 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.233187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.233276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.233310 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.233333 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.337091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.337148 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.337168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.337196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.337216 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.440881 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.440962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.441015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.441053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.441076 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.545502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.545585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.545616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.545650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.545678 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: W1206 01:02:46.578338 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfebf788_e0ac_4bd4_8926_1f7b7cabba71.slice/crio-7c6ab143b3d50eddce6a81268511cda5b1de732cc69445a52c4044bead176fc4 WatchSource:0}: Error finding container 7c6ab143b3d50eddce6a81268511cda5b1de732cc69445a52c4044bead176fc4: Status 404 returned error can't find the container with id 7c6ab143b3d50eddce6a81268511cda5b1de732cc69445a52c4044bead176fc4 Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.591618 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q8mbt"] Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.592562 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:46 crc kubenswrapper[4991]: E1206 01:02:46.592674 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.613361 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.627729 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khpzb\" (UniqueName: \"kubernetes.io/projected/374025e5-ccaa-4cc4-816b-136713f17d6d-kube-api-access-khpzb\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.627837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.644441 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.648444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.648497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.648515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.648538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.648555 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.669451 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:43Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI1206 01:02:43.087189 6324 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:43.087487 6324 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:43.087832 6324 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 01:02:43.087954 6324 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:43.087964 6324 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:43.087980 6324 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 01:02:43.088008 6324 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 01:02:43.088024 6324 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 01:02:43.088083 6324 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 01:02:43.088112 6324 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01:02:43.088139 6324 factory.go:656] Stopping watch factory\\\\nI1206 01:02:43.088146 6324 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 01:02:43.088154 6324 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 01:02:43.088162 6324 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.684530 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.704725 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd
47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.718177 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.728712 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khpzb\" (UniqueName: \"kubernetes.io/projected/374025e5-ccaa-4cc4-816b-136713f17d6d-kube-api-access-khpzb\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.728774 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:46 crc kubenswrapper[4991]: E1206 01:02:46.728917 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:46 crc kubenswrapper[4991]: E1206 01:02:46.729007 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:02:47.228964779 +0000 UTC m=+40.579255247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.736694 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.748654 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khpzb\" (UniqueName: \"kubernetes.io/projected/374025e5-ccaa-4cc4-816b-136713f17d6d-kube-api-access-khpzb\") pod \"network-metrics-daemon-q8mbt\" (UID: 
\"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.748873 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube
-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.756753 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 
01:02:46.756777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.756786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.756800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.756812 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.761408 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.780468 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.794283 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.807858 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.820513 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.834173 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.847582 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.860124 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:46Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:46 crc 
kubenswrapper[4991]: I1206 01:02:46.860243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.860276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.860285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.860301 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.860313 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.963317 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.963369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.963386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.963409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:46 crc kubenswrapper[4991]: I1206 01:02:46.963424 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:46Z","lastTransitionTime":"2025-12-06T01:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.061036 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.066224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.066284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.066304 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.066331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.066348 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.081181 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.100224 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.119184 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.136256 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.151527 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.168568 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.169076 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.169127 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.169140 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.169167 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.169181 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.182352 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc 
kubenswrapper[4991]: I1206 01:02:47.202196 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.225267 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:43Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI1206 01:02:43.087189 6324 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:43.087487 6324 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:43.087832 6324 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 01:02:43.087954 6324 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:43.087964 6324 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:43.087980 6324 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 01:02:43.088008 6324 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 01:02:43.088024 6324 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 01:02:43.088083 6324 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 01:02:43.088112 6324 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01:02:43.088139 6324 factory.go:656] Stopping watch factory\\\\nI1206 01:02:43.088146 6324 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 01:02:43.088154 6324 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 01:02:43.088162 6324 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.235142 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:47 crc kubenswrapper[4991]: E1206 01:02:47.235359 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:47 crc kubenswrapper[4991]: E1206 01:02:47.235430 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:02:48.235411745 +0000 UTC m=+41.585702213 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.238865 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.255940 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.270504 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.272616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.272716 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.272731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.272753 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.272767 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.282810 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.296557 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 
00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.311975 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.355956 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/0.log" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.361467 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.362822 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" event={"ID":"dfebf788-e0ac-4bd4-8926-1f7b7cabba71","Type":"ContainerStarted","Data":"7c6ab143b3d50eddce6a81268511cda5b1de732cc69445a52c4044bead176fc4"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.365482 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.368147 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.369431 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.375867 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.376060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.376099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.376187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.376260 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.393602 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.416411 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.435165 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.453861 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.474793 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.480218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.480283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.480298 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.480339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.480358 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.493654 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.513901 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.535673 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.556627 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.576869 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.583168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.583217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.583233 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.583256 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.583272 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.601496 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:
02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.619655 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc 
kubenswrapper[4991]: I1206 01:02:47.636068 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.663972 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.686311 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.686372 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.686386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.686413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.686429 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.689550 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:43Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI1206 01:02:43.087189 6324 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:43.087487 6324 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:43.087832 6324 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 01:02:43.087954 6324 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:43.087964 6324 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:43.087980 6324 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 01:02:43.088008 6324 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 01:02:43.088024 6324 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 01:02:43.088083 6324 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 01:02:43.088112 6324 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01:02:43.088139 6324 factory.go:656] Stopping watch factory\\\\nI1206 01:02:43.088146 6324 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 01:02:43.088154 6324 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 01:02:43.088162 6324 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.702561 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:47Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.789356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.789411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.789423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.789448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.789464 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.892526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.892587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.892601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.892622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.893030 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.995949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.996039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.996053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.996081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:47 crc kubenswrapper[4991]: I1206 01:02:47.996094 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:47Z","lastTransitionTime":"2025-12-06T01:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.037377 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.037544 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.038063 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.038133 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.038277 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.038294 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.038477 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.038595 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.099913 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.100255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.100266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.100293 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.100312 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.203715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.203764 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.203798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.203827 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.203844 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.246404 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.246686 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.246815 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:02:50.246784856 +0000 UTC m=+43.597075384 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.307634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.307703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.307722 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.307750 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.307770 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.376305 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/1.log" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.377520 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/0.log" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.381547 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194" exitCode=1 Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.381619 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.382047 4991 scope.go:117] "RemoveContainer" containerID="74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.383028 4991 scope.go:117] "RemoveContainer" containerID="25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194" Dec 06 01:02:48 crc kubenswrapper[4991]: E1206 01:02:48.383395 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.383842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" event={"ID":"dfebf788-e0ac-4bd4-8926-1f7b7cabba71","Type":"ContainerStarted","Data":"60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.402925 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.410304 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.410335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.410345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.410365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.410378 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.423287 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.443570 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.460287 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 
+0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.479949 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.497589 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.512864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.512945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.512968 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.513025 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.513046 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.515119 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.533041 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.558016 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.574180 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.589045 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.602548 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc 
kubenswrapper[4991]: I1206 01:02:48.615925 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.615977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.616015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.616040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.616059 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.616846 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.719171 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.719648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.719807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.719942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.720096 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.823389 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.823443 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.823461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.823488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.823505 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.869653 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.902395 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74348a275bbff980e3fcc2f0657d8877d541c9b854bb9f9d26ba76494876f9b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:43Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI1206 01:02:43.087189 6324 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:43.087487 6324 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:43.087832 6324 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1206 01:02:43.087954 6324 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:43.087964 6324 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:43.087980 6324 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1206 01:02:43.088008 6324 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1206 01:02:43.088024 6324 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1206 01:02:43.088083 6324 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1206 01:02:43.088112 6324 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01:02:43.088139 6324 factory.go:656] Stopping watch factory\\\\nI1206 01:02:43.088146 6324 handler.go:208] Removed *v1.Node event handler 2\\\\nI1206 01:02:43.088154 6324 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1206 01:02:43.088162 6324 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 
6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.919138 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:48Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.926793 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.926866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.926885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.926917 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:48 crc kubenswrapper[4991]: I1206 01:02:48.926937 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:48Z","lastTransitionTime":"2025-12-06T01:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.031884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.032273 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.032465 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.032569 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.032649 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.136811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.136870 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.136888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.136913 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.136931 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.241383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.241462 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.241482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.241510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.241531 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.350468 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.350927 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.351123 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.351306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.351452 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.391113 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/1.log" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.398404 4991 scope.go:117] "RemoveContainer" containerID="25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.398711 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" event={"ID":"dfebf788-e0ac-4bd4-8926-1f7b7cabba71","Type":"ContainerStarted","Data":"966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118"} Dec 06 01:02:49 crc kubenswrapper[4991]: E1206 01:02:49.399254 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.416777 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.433750 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.450971 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.454880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc 
kubenswrapper[4991]: I1206 01:02:49.455078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.455176 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.455278 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.455372 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.472736 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.490397 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.513103 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.537904 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.558934 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc 
kubenswrapper[4991]: I1206 01:02:49.559050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.559106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.559124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.559149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.559164 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.576566 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.589523 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.606704 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.628212 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.645656 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.662031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.662085 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.662102 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc 
kubenswrapper[4991]: I1206 01:02:49.662125 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.662137 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.668617 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.684878 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.699144 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.715336 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.736321 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.751928 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.765351 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.765420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.765439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.765466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.765484 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.768758 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.782589 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.796525 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.806224 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.816266 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.827347 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.839384 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.849898 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.867214 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.868752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc 
kubenswrapper[4991]: I1206 01:02:49.868787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.868801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.868821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.868833 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.881506 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.892755 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.908188 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc 
kubenswrapper[4991]: I1206 01:02:49.919366 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:49Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.972310 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.972367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.972396 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.972433 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:49 crc kubenswrapper[4991]: I1206 01:02:49.972460 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:49Z","lastTransitionTime":"2025-12-06T01:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.037828 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.037903 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.037904 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.037967 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:50 crc kubenswrapper[4991]: E1206 01:02:50.038221 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:50 crc kubenswrapper[4991]: E1206 01:02:50.038369 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:50 crc kubenswrapper[4991]: E1206 01:02:50.038547 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:50 crc kubenswrapper[4991]: E1206 01:02:50.038698 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.075618 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.075662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.075682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.075708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.075728 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.178312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.178371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.178392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.178417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.178434 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.269831 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:50 crc kubenswrapper[4991]: E1206 01:02:50.270122 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:50 crc kubenswrapper[4991]: E1206 01:02:50.270273 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:02:54.270235816 +0000 UTC m=+47.620526324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.280793 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.280836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.280849 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.280873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.280889 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.385117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.385177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.385195 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.385217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.385236 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.486936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.487002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.487016 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.487035 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.487045 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.590143 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.590205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.590225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.590252 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.590270 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.693594 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.693652 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.693667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.693684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.693695 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.798709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.798798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.798814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.798837 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.798852 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.901036 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.901108 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.901128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.901156 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:50 crc kubenswrapper[4991]: I1206 01:02:50.901176 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:50Z","lastTransitionTime":"2025-12-06T01:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.004671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.004765 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.004786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.004819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.004849 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.108184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.108257 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.108276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.108305 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.108325 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.210935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.210973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.210997 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.211014 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.211025 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.313803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.313847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.313860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.313878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.313888 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.416901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.416950 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.416963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.417006 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.417024 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.521214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.521325 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.521354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.521393 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.521420 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.624258 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.624360 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.624383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.624858 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.625166 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.728241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.728314 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.728336 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.728369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.728392 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.831442 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.831495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.831511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.831540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.831562 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.934449 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.934519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.934534 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.934556 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:51 crc kubenswrapper[4991]: I1206 01:02:51.934572 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:51Z","lastTransitionTime":"2025-12-06T01:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.036765 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.036843 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.036907 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.036919 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:52 crc kubenswrapper[4991]: E1206 01:02:52.037144 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:52 crc kubenswrapper[4991]: E1206 01:02:52.037210 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:52 crc kubenswrapper[4991]: E1206 01:02:52.037340 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:52 crc kubenswrapper[4991]: E1206 01:02:52.037536 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.037909 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.037962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.038011 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.038041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.038060 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.140294 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.140347 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.140358 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.140377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.140389 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.243120 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.243161 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.243176 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.243196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.243208 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.346150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.346195 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.346213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.346244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.346271 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.451456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.451497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.451511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.451531 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.451546 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.555206 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.555256 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.555268 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.555285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.555299 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.657614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.657669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.657685 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.657711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.657727 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.761151 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.761204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.761220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.761239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.761253 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.864698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.864783 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.864810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.864850 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.864876 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.968497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.968578 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.968590 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.968613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:52 crc kubenswrapper[4991]: I1206 01:02:52.968625 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:52Z","lastTransitionTime":"2025-12-06T01:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.058944 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.059031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.059050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.059071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.059089 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: E1206 01:02:53.081391 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:53Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.087267 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.087337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.087354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.087377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.087392 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: E1206 01:02:53.112031 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:53Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.118797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.118953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.119118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.119157 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.119180 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: E1206 01:02:53.147318 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:53Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.152685 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.152733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.152752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.152785 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.152802 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: E1206 01:02:53.171659 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:53Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.176229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.176271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.176286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.176309 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.176322 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: E1206 01:02:53.193667 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:53Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:53 crc kubenswrapper[4991]: E1206 01:02:53.193907 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.195969 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.196042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.196053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.196076 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.196088 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.299331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.299378 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.299392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.299411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.299423 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.404015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.404087 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.404107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.404138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.404164 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.506660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.506748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.506765 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.506786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.506800 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.610018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.610129 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.610152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.610185 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.610210 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.713799 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.713849 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.713858 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.713873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.713883 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.816685 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.816742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.816757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.816779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.816790 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.919276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.919344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.919367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.919398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:53 crc kubenswrapper[4991]: I1206 01:02:53.919422 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:53Z","lastTransitionTime":"2025-12-06T01:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.022044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.022134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.022158 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.022190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.022218 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.037233 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.037320 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.037340 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.037315 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:54 crc kubenswrapper[4991]: E1206 01:02:54.037427 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:54 crc kubenswrapper[4991]: E1206 01:02:54.037585 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:54 crc kubenswrapper[4991]: E1206 01:02:54.037681 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:54 crc kubenswrapper[4991]: E1206 01:02:54.037751 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.125441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.125505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.125517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.125540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.125559 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.228471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.228514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.228524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.228538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.228549 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.318354 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:54 crc kubenswrapper[4991]: E1206 01:02:54.318669 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:54 crc kubenswrapper[4991]: E1206 01:02:54.318849 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:03:02.318803793 +0000 UTC m=+55.669094431 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.331122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.331178 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.331196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.331225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.331246 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.434529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.434588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.434602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.434625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.434638 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.537797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.537852 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.537862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.537878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.537888 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.641492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.641574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.641593 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.641615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.641633 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.745178 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.745243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.745258 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.745277 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.745291 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.848926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.849013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.849026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.849048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.849063 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.952169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.952241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.952253 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.952275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:54 crc kubenswrapper[4991]: I1206 01:02:54.952289 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:54Z","lastTransitionTime":"2025-12-06T01:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.055904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.055959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.055975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.056019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.056032 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.159037 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.159106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.159119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.159134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.159146 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.262410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.262460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.262470 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.262487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.262500 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.365760 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.365810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.365822 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.365838 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.365849 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.467885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.467924 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.467936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.467961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.467974 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.571155 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.571220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.571240 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.571266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.571283 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.674193 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.674232 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.674241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.674255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.674265 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.777339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.777390 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.777399 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.777416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.777428 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.880717 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.880774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.880788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.880808 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.880824 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.986847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.986935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.986946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.986968 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:55 crc kubenswrapper[4991]: I1206 01:02:55.986977 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:55Z","lastTransitionTime":"2025-12-06T01:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.037820 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.037815 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:56 crc kubenswrapper[4991]: E1206 01:02:56.038060 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.037848 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:56 crc kubenswrapper[4991]: E1206 01:02:56.038167 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.037815 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:56 crc kubenswrapper[4991]: E1206 01:02:56.038219 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:56 crc kubenswrapper[4991]: E1206 01:02:56.038341 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.091761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.091871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.091890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.091918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.091937 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.194820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.194858 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.194867 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.194883 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.194894 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.298915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.299023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.299083 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.299113 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.299136 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.402168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.402245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.402261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.402287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.402352 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.507028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.507109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.507131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.507164 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.507184 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.610263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.610335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.610375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.610407 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.610459 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.714133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.714202 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.714216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.714239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.714255 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.817871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.817957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.817976 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.818038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.818058 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.921360 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.921424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.921442 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.921475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:56 crc kubenswrapper[4991]: I1206 01:02:56.921490 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:56Z","lastTransitionTime":"2025-12-06T01:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.024409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.024484 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.024506 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.024535 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.024559 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.060582 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.082341 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.104272 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.119886 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.127679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.127751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.127777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc 
kubenswrapper[4991]: I1206 01:02:57.127811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.127836 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.131293 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06
T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.147124 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.162060 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.181411 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.198625 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.218633 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.231382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc 
kubenswrapper[4991]: I1206 01:02:57.231455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.231471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.231495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.231512 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.235872 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.248766 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc 
kubenswrapper[4991]: I1206 01:02:57.260688 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.276911 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.308614 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.320281 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:02:57Z is after 2025-08-24T17:21:41Z" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.334263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.334295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.334307 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.334328 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.334342 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.437950 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.438038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.438066 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.438092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.438110 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.541819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.541908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.541928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.541960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.542018 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.645380 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.645445 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.645463 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.645493 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.645513 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.749548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.749636 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.749662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.749703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.749729 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.814164 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.815372 4991 scope.go:117] "RemoveContainer" containerID="25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194" Dec 06 01:02:57 crc kubenswrapper[4991]: E1206 01:02:57.815591 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.853403 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.853803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.853832 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.853868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.853898 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.864074 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:02:57 crc kubenswrapper[4991]: E1206 01:02:57.864345 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:03:29.864312317 +0000 UTC m=+83.214602825 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.957969 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.958079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.958100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:57 crc kubenswrapper[4991]: I1206 01:02:57.958127 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:57 crc kubenswrapper[4991]: 
I1206 01:02:57.958147 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:57Z","lastTransitionTime":"2025-12-06T01:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.037219 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.037330 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.037330 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.037480 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.037582 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.037950 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.038212 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.038438 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.062727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.062809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.062827 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.062862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.062883 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.066687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.066761 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.066802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.066852 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.066962 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067116 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:03:30.067087517 +0000 UTC m=+83.417377995 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067111 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067174 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067208 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.066968 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067310 4991 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067331 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067343 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:03:30.067309743 +0000 UTC m=+83.417600391 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067412 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:03:30.067388266 +0000 UTC m=+83.417678744 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067125 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: E1206 01:02:58.067492 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:03:30.067478418 +0000 UTC m=+83.417769106 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.166566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.166634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.166653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.166679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.166704 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.269691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.269734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.269744 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.269759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.269770 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.372494 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.372566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.372587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.372617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.372634 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.477812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.477887 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.477912 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.477948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.477974 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.581206 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.581274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.581288 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.581309 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.581506 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.685238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.685320 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.685341 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.685373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.685396 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.789090 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.789180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.789202 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.789234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.789256 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.892885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.892945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.892966 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.893027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.893048 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.996067 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.996135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.996159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.996192 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:58 crc kubenswrapper[4991]: I1206 01:02:58.996215 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:58Z","lastTransitionTime":"2025-12-06T01:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.098604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.098673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.098699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.098735 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.098759 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.202248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.202357 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.202386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.202423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.202452 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.306221 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.306266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.306280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.306298 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.306311 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.409355 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.409428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.409448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.409478 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.409501 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.512832 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.512897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.512907 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.512925 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.512935 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.616730 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.616798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.616811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.616832 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.616845 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.720244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.720326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.720348 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.720383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.720408 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.823790 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.823851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.823865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.823893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.823910 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.927604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.927674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.927692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.927723 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:02:59 crc kubenswrapper[4991]: I1206 01:02:59.927742 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:02:59Z","lastTransitionTime":"2025-12-06T01:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.031402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.031470 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.031497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.031533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.031557 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.037079 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.037136 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.037139 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.037096 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:00 crc kubenswrapper[4991]: E1206 01:03:00.037348 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:00 crc kubenswrapper[4991]: E1206 01:03:00.037476 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:00 crc kubenswrapper[4991]: E1206 01:03:00.037751 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:00 crc kubenswrapper[4991]: E1206 01:03:00.037924 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.134823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.134884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.134895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.134915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.134927 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.238582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.238664 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.238689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.238728 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.238752 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.342826 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.342912 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.342938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.342967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.343017 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.446330 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.446394 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.446422 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.446460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.446489 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.550411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.550490 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.550510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.550538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.550558 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.654665 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.654751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.654762 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.654784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.654798 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.758367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.758444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.758458 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.758482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.758496 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.779079 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.806862 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.837860 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.855548 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.864366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.864428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.864448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.864480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.864500 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.874112 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 
00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.892443 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.909863 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.929471 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.945879 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca
726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.964530 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.968335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.968405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.968422 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.968448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.968469 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:00Z","lastTransitionTime":"2025-12-06T01:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:00 crc kubenswrapper[4991]: I1206 01:03:00.988611 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e12
83910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:00Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.007715 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:01Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.024072 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:01Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.041856 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:01Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.058950 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:01Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.070310 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:01Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc 
kubenswrapper[4991]: I1206 01:03:01.071219 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.071274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.071288 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.071311 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.071323 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.080940 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:01Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.174234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.174306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.174324 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.174348 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.174362 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.277538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.277606 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.277618 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.277637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.277652 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.380866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.380936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.380956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.381041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.381064 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.484777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.484854 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.484874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.484904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.484924 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.588143 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.588216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.588234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.588264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.588291 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.691648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.691700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.691709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.691728 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.691738 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.794960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.795060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.795078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.795103 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.795119 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.916295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.916357 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.916371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.916395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:01 crc kubenswrapper[4991]: I1206 01:03:01.916409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:01Z","lastTransitionTime":"2025-12-06T01:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.019135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.019179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.019187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.019201 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.019211 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.037810 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.038045 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.038106 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.038146 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:02 crc kubenswrapper[4991]: E1206 01:03:02.038057 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:02 crc kubenswrapper[4991]: E1206 01:03:02.038344 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:02 crc kubenswrapper[4991]: E1206 01:03:02.038394 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:02 crc kubenswrapper[4991]: E1206 01:03:02.038472 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.121600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.121666 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.121680 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.121703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.121717 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.224326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.224414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.224434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.224466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.224486 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: E1206 01:03:02.321217 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:03:02 crc kubenswrapper[4991]: E1206 01:03:02.321382 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:03:18.321342835 +0000 UTC m=+71.671633333 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.321589 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.328851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.328921 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.328947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.329017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.329051 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.432609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.432681 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.432699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.432725 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.432746 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.494417 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.514323 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.514486 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.533877 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.536872 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc 
kubenswrapper[4991]: I1206 01:03:02.536937 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.536956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.537002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.537019 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.555216 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.569602 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.584273 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.601475 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.615666 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.629527 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc 
kubenswrapper[4991]: I1206 01:03:02.640296 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.640356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.640371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.640408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.640422 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.646656 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.664532 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.688122 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.701789 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.715839 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.731793 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.743189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.743264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.743282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 
01:03:02.743307 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.743323 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.751280 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.766931 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:02Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.846496 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.846534 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.846547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc 
kubenswrapper[4991]: I1206 01:03:02.846565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.846577 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.949532 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.949585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.949599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.949617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:02 crc kubenswrapper[4991]: I1206 01:03:02.949630 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:02Z","lastTransitionTime":"2025-12-06T01:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.052205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.052245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.052257 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.052272 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.052284 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.155645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.155698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.155712 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.155731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.155743 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.258188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.258234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.258247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.258266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.258280 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.360749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.360819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.360828 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.360842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.360851 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.463254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.463299 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.463311 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.463327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.463338 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.479710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.479749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.479761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.479779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.479790 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: E1206 01:03:03.497158 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:03Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.501604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.501650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.501662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.501683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.501697 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: E1206 01:03:03.520972 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:03Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.529345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.529400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.529416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.529434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.529446 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: E1206 01:03:03.544747 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:03Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.550558 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.550869 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.551108 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.551293 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.551488 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: E1206 01:03:03.565676 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:03Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.569906 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.569962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.569973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.570017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.570029 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: E1206 01:03:03.584705 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:03Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:03 crc kubenswrapper[4991]: E1206 01:03:03.584853 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.586487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.586522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.586534 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.586552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.586565 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.689597 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.689671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.689691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.689720 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.689739 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.792327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.792363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.792373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.792388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.792399 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.895793 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.895875 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.895903 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.895943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.895966 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.998864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.998932 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:03 crc kubenswrapper[4991]: I1206 01:03:03.998956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:03.999008 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:03.999029 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:03Z","lastTransitionTime":"2025-12-06T01:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.037924 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.037957 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.037925 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.038095 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:04 crc kubenswrapper[4991]: E1206 01:03:04.038135 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:04 crc kubenswrapper[4991]: E1206 01:03:04.038300 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:04 crc kubenswrapper[4991]: E1206 01:03:04.038466 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:04 crc kubenswrapper[4991]: E1206 01:03:04.038333 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.102059 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.102119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.102129 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.102146 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.102157 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.204889 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.204937 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.204946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.204961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.204973 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.307261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.307298 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.307306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.307321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.307330 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.410187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.410221 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.410237 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.410255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.410264 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.513452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.513499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.513509 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.513525 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.513536 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.616168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.616241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.616252 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.616271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.616282 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.719178 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.719250 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.719272 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.719302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.719320 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.823791 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.823851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.823869 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.823895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.823916 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.926866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.926938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.926962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.927025 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:04 crc kubenswrapper[4991]: I1206 01:03:04.927047 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:04Z","lastTransitionTime":"2025-12-06T01:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.029715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.029796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.029816 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.029847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.029866 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.133203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.133283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.133301 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.133332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.133351 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.236322 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.236416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.236442 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.236481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.236504 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.340119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.340193 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.340212 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.340240 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.340259 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.444573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.444649 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.444668 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.444695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.444715 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.547191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.547270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.547286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.547311 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.547330 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.649628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.649680 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.649696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.649723 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.649741 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.754109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.754268 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.754286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.754312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.754326 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.857860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.857913 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.857926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.857950 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.857964 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.962065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.962145 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.962169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.962196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:05 crc kubenswrapper[4991]: I1206 01:03:05.962216 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:05Z","lastTransitionTime":"2025-12-06T01:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.037841 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.037915 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.037862 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.038132 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:06 crc kubenswrapper[4991]: E1206 01:03:06.038371 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:06 crc kubenswrapper[4991]: E1206 01:03:06.038534 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:06 crc kubenswrapper[4991]: E1206 01:03:06.038742 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:06 crc kubenswrapper[4991]: E1206 01:03:06.038897 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.065529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.065645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.065662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.065704 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.065719 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.168866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.168930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.168944 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.168967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.168993 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.272041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.272111 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.272128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.272148 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.272165 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.375407 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.375477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.375546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.375593 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.375620 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.478022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.478084 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.478096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.478118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.478132 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.581247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.581321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.581335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.581354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.581367 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.684511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.684555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.684566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.684581 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.684591 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.787677 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.787752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.787764 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.787801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.787815 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.891431 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.891486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.891499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.891519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.891531 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.994572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.994651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.994669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.995149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:06 crc kubenswrapper[4991]: I1206 01:03:06.995213 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:06Z","lastTransitionTime":"2025-12-06T01:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.054005 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.069281 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.087380 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.100915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.100958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.100970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.101011 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.100962 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.101027 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.116338 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 
00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.133634 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.151021 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.166527 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.183009 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.197782 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.204845 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.204889 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.204902 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.204920 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.204934 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.213568 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.229716 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.244826 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.264386 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.281412 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e4
9a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:
02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.301539 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.307492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.307539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.307554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.307579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.307594 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.314389 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:07Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:07 crc 
kubenswrapper[4991]: I1206 01:03:07.410794 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.410846 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.410856 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.410875 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.410886 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.513024 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.513078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.513089 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.513106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.513117 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.615727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.615802 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.615820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.615849 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.615874 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.718818 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.718886 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.718899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.718921 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.718934 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.822305 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.822369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.822390 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.822417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.822437 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.924895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.924947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.924960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.924977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:07 crc kubenswrapper[4991]: I1206 01:03:07.925023 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:07Z","lastTransitionTime":"2025-12-06T01:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.028449 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.028703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.028714 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.028731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.028743 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.037741 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.037816 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:08 crc kubenswrapper[4991]: E1206 01:03:08.037833 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.037878 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:08 crc kubenswrapper[4991]: E1206 01:03:08.038044 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.037813 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:08 crc kubenswrapper[4991]: E1206 01:03:08.038149 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:08 crc kubenswrapper[4991]: E1206 01:03:08.037899 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.131607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.131650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.131661 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.131680 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.131690 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.234818 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.234904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.234925 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.234972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.235045 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.338540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.338590 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.338600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.338641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.338655 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.442058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.442106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.442116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.442132 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.442145 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.544936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.545045 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.545058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.545073 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.545083 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.648506 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.648572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.648621 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.648692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.648716 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.755808 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.756231 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.756308 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.756403 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.756492 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.859287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.859593 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.859695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.859784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.859852 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.963066 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.963115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.963124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.963142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:08 crc kubenswrapper[4991]: I1206 01:03:08.963153 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:08Z","lastTransitionTime":"2025-12-06T01:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.065819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.066235 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.066382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.066522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.066695 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.170615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.171526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.171681 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.171821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.172155 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.276244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.276340 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.276361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.276390 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.276409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.379908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.380275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.380387 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.380487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.380566 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.482508 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.482562 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.482575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.482595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.482609 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.585653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.585701 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.585710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.585732 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.585742 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.688612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.689045 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.689161 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.689251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.689319 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.792230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.792617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.792710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.792778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.792836 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.895789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.895842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.895854 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.895875 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.895888 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.998555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.998904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.999037 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.999180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:09 crc kubenswrapper[4991]: I1206 01:03:09.999274 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:09Z","lastTransitionTime":"2025-12-06T01:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.037177 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.037197 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.037268 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:10 crc kubenswrapper[4991]: E1206 01:03:10.037769 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:10 crc kubenswrapper[4991]: E1206 01:03:10.037606 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:10 crc kubenswrapper[4991]: E1206 01:03:10.037820 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.037368 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:10 crc kubenswrapper[4991]: E1206 01:03:10.037957 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.101548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.101910 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.101975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.102082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.102144 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.205342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.206124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.206158 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.206187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.206205 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.309031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.309092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.309109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.309136 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.309154 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.412093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.412151 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.412161 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.412181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.412192 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.515067 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.515158 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.515184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.515218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.515244 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.618547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.618613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.618633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.618663 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.618684 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.722486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.722563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.722589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.722625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.722652 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.825424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.825489 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.825504 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.825532 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.825547 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.928323 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.928362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.928371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.928385 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:10 crc kubenswrapper[4991]: I1206 01:03:10.928395 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:10Z","lastTransitionTime":"2025-12-06T01:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.031243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.031284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.031297 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.031314 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.031327 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.133942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.133979 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.134001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.134016 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.134026 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.238034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.238088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.238098 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.238114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.238124 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.341122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.341216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.341237 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.341274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.341337 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.444444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.444522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.444537 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.444562 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.444575 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.546895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.546967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.547012 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.547043 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.547062 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.649796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.649857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.649874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.649898 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.649915 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.752880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.752940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.752953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.752978 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.753018 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.857363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.857431 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.857449 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.857477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.857496 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.960520 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.960570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.960583 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.960603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:11 crc kubenswrapper[4991]: I1206 01:03:11.960616 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:11Z","lastTransitionTime":"2025-12-06T01:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.037830 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.037949 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.038019 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:12 crc kubenswrapper[4991]: E1206 01:03:12.038081 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.038218 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:12 crc kubenswrapper[4991]: E1206 01:03:12.038204 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:12 crc kubenswrapper[4991]: E1206 01:03:12.038272 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:12 crc kubenswrapper[4991]: E1206 01:03:12.038337 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.064282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.064326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.064335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.064368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.064383 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.167712 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.167766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.167784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.167814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.167835 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.270327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.270398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.270420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.270451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.270475 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.372891 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.372930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.372941 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.372957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.372967 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.476441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.476557 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.476575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.476604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.476628 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.580445 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.580507 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.580518 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.580538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.580554 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.683394 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.683455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.683468 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.683490 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.683503 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.786782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.786846 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.786859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.786890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.786905 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.890055 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.890115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.890130 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.890156 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.890176 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.996777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.997068 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.997084 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.997103 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:12 crc kubenswrapper[4991]: I1206 01:03:12.997114 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:12Z","lastTransitionTime":"2025-12-06T01:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.038270 4991 scope.go:117] "RemoveContainer" containerID="25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.101409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.101464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.101476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.101494 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.101504 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.205081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.205150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.205165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.205188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.205204 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.307957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.308021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.308034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.308052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.308062 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.411114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.411152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.411160 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.411174 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.411184 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.499283 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/1.log" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.501481 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.502205 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.513461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.513506 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.513515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.513532 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.513542 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.518287 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.546064 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.566218 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 
01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.578720 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.592370 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\"
,\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 
01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.606559 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.616757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.616787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc 
kubenswrapper[4991]: I1206 01:03:13.616796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.616813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.616823 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.620644 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.644875 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.678102 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.700434 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.719169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc 
kubenswrapper[4991]: I1206 01:03:13.719216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.719226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.719244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.719255 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.726827 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.740859 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.761723 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.783780 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.799809 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.815638 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc 
kubenswrapper[4991]: I1206 01:03:13.822319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.822367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.822377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.822395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.822410 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.827115 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.891485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.891533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.891544 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.891561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.891574 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: E1206 01:03:13.904016 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.911191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.911236 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.911251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.911270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.911281 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: E1206 01:03:13.924029 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.935410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.935467 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.935480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.935499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.935513 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: E1206 01:03:13.949222 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.955715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.955778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.955794 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.955821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.955835 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: E1206 01:03:13.976809 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.980906 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.980935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.980944 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.980958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.980968 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:13 crc kubenswrapper[4991]: E1206 01:03:13.996508 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:13Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:13 crc kubenswrapper[4991]: E1206 01:03:13.996619 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.998308 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.998374 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.998389 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.998413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:13 crc kubenswrapper[4991]: I1206 01:03:13.998434 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:13Z","lastTransitionTime":"2025-12-06T01:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.037225 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.037304 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.037304 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.037225 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:14 crc kubenswrapper[4991]: E1206 01:03:14.037418 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:14 crc kubenswrapper[4991]: E1206 01:03:14.037560 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:14 crc kubenswrapper[4991]: E1206 01:03:14.037710 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:14 crc kubenswrapper[4991]: E1206 01:03:14.037821 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.101899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.101954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.101971 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.102027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.102045 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.205114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.205178 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.205188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.205206 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.205216 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.308373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.308426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.308442 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.308461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.308475 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.411626 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.411669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.411678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.411695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.411707 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.506922 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/2.log" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.507497 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/1.log" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.510788 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12" exitCode=1 Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.510870 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.510957 4991 scope.go:117] "RemoveContainer" containerID="25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.511825 4991 scope.go:117] "RemoveContainer" containerID="34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12" Dec 06 01:03:14 crc kubenswrapper[4991]: E1206 01:03:14.512463 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.515273 4991 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.515319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.515332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.515347 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.515359 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.534609 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.555121 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25af58348673cb95595baaa6e45859db48cde037f62e8e7f49aac6a5932fb194\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:02:48Z\\\",\\\"message\\\":\\\"8.084433 6500 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084572 6500 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084834 6500 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.084927 6500 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1206 01:02:48.085548 6500 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1206 01:02:48.085562 6500 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1206 01:02:48.085580 6500 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1206 01:02:48.085605 6500 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1206 01:02:48.085631 6500 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1206 01:02:48.089921 6500 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1206 01:02:48.089953 6500 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1206 01:02:48.090015 6500 factory.go:656] Stopping watch factory\\\\nI1206 01:02:48.090031 6500 ovnkube.go:599] Stopped ovnkube\\\\nI1206 01:02:48.090073 6500 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1206 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.567714 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.583756 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.597936 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.611798 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.618486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.618539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.618551 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc 
kubenswrapper[4991]: I1206 01:03:14.618572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.618586 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.626212 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01
:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" 
enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.638563 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
6T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.653502 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.682212 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.698188 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.716600 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.721737 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc 
kubenswrapper[4991]: I1206 01:03:14.721831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.721857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.721892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.721916 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.735185 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.753578 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.772773 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.790673 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc 
kubenswrapper[4991]: I1206 01:03:14.806135 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:14Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.825628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.825679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.825689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.825706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.825720 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.928369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.928446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.928464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.928491 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:14 crc kubenswrapper[4991]: I1206 01:03:14.928507 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:14Z","lastTransitionTime":"2025-12-06T01:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.031677 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.031756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.031766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.031783 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.031793 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.135588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.135665 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.135677 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.135699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.135716 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.239001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.239071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.239081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.239101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.239112 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.342044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.342106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.342115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.342136 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.342148 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.445324 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.445384 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.445397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.445417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.445429 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.516261 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/2.log" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.520435 4991 scope.go:117] "RemoveContainer" containerID="34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12" Dec 06 01:03:15 crc kubenswrapper[4991]: E1206 01:03:15.520639 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.543781 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.547968 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.548081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.548109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.548144 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.548168 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.573143 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.588622 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.605142 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.618400 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.631963 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.646081 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\"
,\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 
01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.651147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.651191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.651204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.651224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.651237 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.663191 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.683329 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.704291 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.722878 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.743567 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.754056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc 
kubenswrapper[4991]: I1206 01:03:15.754122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.754134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.754152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.754169 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.766549 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.782556 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.800843 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.816403 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc 
kubenswrapper[4991]: I1206 01:03:15.828498 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:15Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.857359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.857411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.857440 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.857460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.857474 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.960277 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.960404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.960424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.960454 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:15 crc kubenswrapper[4991]: I1206 01:03:15.960477 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:15Z","lastTransitionTime":"2025-12-06T01:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.037747 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.037769 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:16 crc kubenswrapper[4991]: E1206 01:03:16.037943 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.037967 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.038056 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:16 crc kubenswrapper[4991]: E1206 01:03:16.038165 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:16 crc kubenswrapper[4991]: E1206 01:03:16.038282 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:16 crc kubenswrapper[4991]: E1206 01:03:16.038445 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.063714 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.063765 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.063786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.063813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.063832 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.166933 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.166977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.167012 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.167031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.167043 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.270547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.270592 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.270603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.270623 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.270635 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.374400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.374505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.374530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.374565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.374604 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.478599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.478657 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.478667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.478686 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.478697 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.582745 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.582809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.582819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.582839 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.582851 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.685242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.685295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.685306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.685324 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.685342 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.788855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.788956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.788969 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.789038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.789055 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.891890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.892028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.892058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.892094 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.892121 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.995038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.995092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.995104 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.995127 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:16 crc kubenswrapper[4991]: I1206 01:03:16.995137 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:16Z","lastTransitionTime":"2025-12-06T01:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.062420 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.081888 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.097423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.097472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.097481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc 
kubenswrapper[4991]: I1206 01:03:17.097498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.097510 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.099174 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01
:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" 
enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.116888 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.137755 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.157854 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.177906 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.198398 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.200949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc 
kubenswrapper[4991]: I1206 01:03:17.201048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.201075 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.201105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.201126 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.220249 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951
f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.235164 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.249866 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.260454 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.272183 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.285608 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T
01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.304289 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.304371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.304398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.304437 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.304464 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.315408 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.328226 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.354266 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:17Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.407892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.407967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.408008 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.408039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.408056 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.511252 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.511303 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.511315 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.511332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.511344 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.614922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.614980 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.615027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.615052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.615068 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.718898 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.719023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.719049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.719085 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.719114 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.822103 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.822170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.822187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.822216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.822240 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.925566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.925621 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.925634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.925655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:17 crc kubenswrapper[4991]: I1206 01:03:17.925669 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:17Z","lastTransitionTime":"2025-12-06T01:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.033522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.033880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.034276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.034302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.034322 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.037859 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.037970 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.038051 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:18 crc kubenswrapper[4991]: E1206 01:03:18.038182 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.038613 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:18 crc kubenswrapper[4991]: E1206 01:03:18.038662 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:18 crc kubenswrapper[4991]: E1206 01:03:18.038672 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:18 crc kubenswrapper[4991]: E1206 01:03:18.038755 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.050668 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.137623 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.137658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.137684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.137702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.137715 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.240698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.240775 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.240788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.240812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.240826 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.325955 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:18 crc kubenswrapper[4991]: E1206 01:03:18.326252 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:03:18 crc kubenswrapper[4991]: E1206 01:03:18.326358 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:03:50.326324633 +0000 UTC m=+103.676615302 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.343287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.343359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.343376 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.343416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.343432 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.446316 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.446373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.446393 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.446430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.446457 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.550318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.550389 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.550408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.550816 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.550874 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.654564 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.654625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.654645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.654673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.654692 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.758532 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.758595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.758614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.758641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.758662 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.862464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.862523 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.862541 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.862569 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.862588 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.965869 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.965932 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.965945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.965963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:18 crc kubenswrapper[4991]: I1206 01:03:18.965975 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:18Z","lastTransitionTime":"2025-12-06T01:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.069121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.069200 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.069230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.069296 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.069340 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.172239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.172336 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.172356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.172414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.172436 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.275859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.275894 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.275905 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.275922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.275935 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.379121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.379177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.379191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.379212 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.379226 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.481920 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.482019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.482039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.482066 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.482084 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.585328 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.585419 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.585437 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.585466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.585511 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.688607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.688671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.688688 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.688719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.688738 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.792340 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.792414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.792434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.792468 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.792492 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.895619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.895663 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.895674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.895695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.895707 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.999328 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.999409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.999428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.999460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:19 crc kubenswrapper[4991]: I1206 01:03:19.999480 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:19Z","lastTransitionTime":"2025-12-06T01:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.037152 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.037142 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.037299 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.037174 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:20 crc kubenswrapper[4991]: E1206 01:03:20.037419 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:20 crc kubenswrapper[4991]: E1206 01:03:20.037572 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:20 crc kubenswrapper[4991]: E1206 01:03:20.037811 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:20 crc kubenswrapper[4991]: E1206 01:03:20.037890 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.103589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.103647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.103661 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.103682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.103699 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.206505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.206578 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.206589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.206607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.206651 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.309388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.309444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.309457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.309479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.309492 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.413321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.413405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.413430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.413464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.413487 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.516690 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.516802 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.516821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.516890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.516911 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.539822 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/0.log" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.539889 4991 generic.go:334] "Generic (PLEG): container finished" podID="06f95c58-fd69-48f6-94be-15927aa850c2" containerID="d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147" exitCode=1 Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.539931 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerDied","Data":"d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.540394 4991 scope.go:117] "RemoveContainer" containerID="d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.560589 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"2025-12-06T01:02:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c\\\\n2025-12-06T01:02:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c to /host/opt/cni/bin/\\\\n2025-12-06T01:02:33Z [verbose] multus-daemon started\\\\n2025-12-06T01:02:33Z [verbose] Readiness Indicator file check\\\\n2025-12-06T01:03:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.584835 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.605142 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca
726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.621295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.621362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.622140 4991 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.621887 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.622407 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.622463 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.641407 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686
b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.658712 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.675800 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.699881 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.718954 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc 
kubenswrapper[4991]: I1206 01:03:20.725095 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.725203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.725223 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.725245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.725259 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.738106 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.751931 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43bf08a-5774-4727-b681-b5d2291e51dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cf257527acb99d14685360262c24dfe896a1ae4e11680b2b8e1c68720182ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf
3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.767099 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.786199 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.796870 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.809066 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.821434 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.827938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.827964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.827973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 
01:03:20.828011 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.828021 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.836208 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.851890 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:20Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.930945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.930977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.931002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:20 crc 
kubenswrapper[4991]: I1206 01:03:20.931018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:20 crc kubenswrapper[4991]: I1206 01:03:20.931028 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:20Z","lastTransitionTime":"2025-12-06T01:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.034632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.034692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.034713 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.034741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.034762 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.137966 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.138057 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.138074 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.138100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.138119 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.240758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.240809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.240822 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.240844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.240861 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.344154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.344203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.344215 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.344235 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.344248 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.448249 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.448310 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.448329 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.448363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.448388 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.547472 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/0.log" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.547573 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerStarted","Data":"ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.551101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.551151 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.551689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.551732 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.551754 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.569123 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.598050 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.616188 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.632014 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43bf08a-5774-4727-b681-b5d2291e51dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cf257527acb99d14685360262c24dfe896a1ae4e11680b2b8e1c68720182ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc 
kubenswrapper[4991]: I1206 01:03:21.649400 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.654756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.654795 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.654813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.654836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.654853 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.665552 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.681505 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.697783 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\"
,\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 
01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.711878 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.732867 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.755184 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.758453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.758549 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.758577 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.758615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.758641 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.776413 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.801396 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"2025-12-06T01:02:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c\\\\n2025-12-06T01:02:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c to /host/opt/cni/bin/\\\\n2025-12-06T01:02:33Z [verbose] multus-daemon started\\\\n2025-12-06T01:02:33Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T01:03:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.829194 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6
d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.851063 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca
726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.862160 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.862235 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.862259 4991 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.862290 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.862313 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.873663 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.893407 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.912186 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:21Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.966271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.966332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.966345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.966370 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:21 crc kubenswrapper[4991]: I1206 01:03:21.966386 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:21Z","lastTransitionTime":"2025-12-06T01:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.037638 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.037758 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.037803 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:22 crc kubenswrapper[4991]: E1206 01:03:22.037968 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.038042 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:22 crc kubenswrapper[4991]: E1206 01:03:22.038170 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:22 crc kubenswrapper[4991]: E1206 01:03:22.038319 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:22 crc kubenswrapper[4991]: E1206 01:03:22.038427 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.070552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.070616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.070635 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.070662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.070682 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.173818 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.173884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.173901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.173926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.173943 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.276627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.276667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.276676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.276693 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.276704 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.380468 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.380531 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.380548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.380575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.380593 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.484129 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.484196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.484220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.484257 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.484280 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.587207 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.587275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.587300 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.587333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.587359 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.691464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.691528 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.691546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.691574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.691593 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.795656 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.795719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.795831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.795861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.795883 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.899109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.899169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.899182 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.899203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:22 crc kubenswrapper[4991]: I1206 01:03:22.899214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:22Z","lastTransitionTime":"2025-12-06T01:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.002306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.002388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.002405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.002428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.002442 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.105727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.105790 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.105802 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.105828 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.105843 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.209709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.209833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.209868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.209908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.209935 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.313234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.313321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.313346 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.313379 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.313402 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.417124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.417207 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.417231 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.417264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.417291 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.520787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.520855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.520867 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.520888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.520901 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.624316 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.624378 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.624396 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.624426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.624445 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.727686 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.727769 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.727804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.727837 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.727861 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.830820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.830879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.830930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.830960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.831029 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.934301 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.934375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.934400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.934432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:23 crc kubenswrapper[4991]: I1206 01:03:23.934456 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:23Z","lastTransitionTime":"2025-12-06T01:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.036810 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.036854 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.036903 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.036859 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.036980 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.037212 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.037295 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.037981 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.037958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.038182 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.038196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.038217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.038241 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.078662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.078740 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.078765 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.078797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.078821 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.094482 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:24Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.100023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.100106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.100125 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.100184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.100205 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.122059 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:24Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.128381 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.128451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.128481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.128519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.128546 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.166644 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:24Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.173047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.173124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.173137 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.173159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.173197 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.195577 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:24Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:24 crc kubenswrapper[4991]: E1206 01:03:24.195867 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.198852 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.198896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.198928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.198948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.198963 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.302876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.302957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.303009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.303031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.303048 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.406421 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.406466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.406476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.406492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.406503 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.510212 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.510273 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.510290 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.510312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.510328 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.614027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.614090 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.614113 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.614142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.614163 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.718298 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.718342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.718353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.718371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.718382 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.821516 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.821560 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.821571 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.821586 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.821599 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.924609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.924727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.924748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.924777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:24 crc kubenswrapper[4991]: I1206 01:03:24.924796 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:24Z","lastTransitionTime":"2025-12-06T01:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.027061 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.027140 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.027156 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.027177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.027214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.130286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.130350 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.130368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.130391 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.130407 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.234261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.234311 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.234322 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.234342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.234355 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.338138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.338202 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.338221 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.338250 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.338269 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.442180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.442238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.442253 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.442276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.442291 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.545643 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.545717 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.545734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.545759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.545776 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.649323 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.649727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.649823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.649921 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.650077 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.752750 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.752840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.752869 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.752911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.752937 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.857382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.857471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.857495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.857529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.857606 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.971117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.971204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.971228 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.971262 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:25 crc kubenswrapper[4991]: I1206 01:03:25.971289 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:25Z","lastTransitionTime":"2025-12-06T01:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.037454 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:26 crc kubenswrapper[4991]: E1206 01:03:26.037653 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.037936 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:26 crc kubenswrapper[4991]: E1206 01:03:26.038038 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.038201 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:26 crc kubenswrapper[4991]: E1206 01:03:26.038283 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.038443 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:26 crc kubenswrapper[4991]: E1206 01:03:26.038526 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.074757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.074833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.074890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.074923 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.074955 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.179130 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.179214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.179238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.179269 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.179290 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.282550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.282620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.282638 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.282663 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.282679 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.387402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.387474 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.387495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.387525 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.387544 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.491315 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.491388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.491406 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.491433 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.491451 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.597286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.597352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.597370 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.597414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.597454 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.701398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.701476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.701496 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.701527 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.701549 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.804530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.804567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.804576 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.804596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.804607 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.907315 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.907398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.907422 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.907456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:26 crc kubenswrapper[4991]: I1206 01:03:26.907481 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:26Z","lastTransitionTime":"2025-12-06T01:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.011134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.011213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.011232 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.011264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.011284 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.057215 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.075900 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43bf08a-5774-4727-b681-b5d2291e51dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cf257527acb99d14685360262c24dfe896a1ae4e11680b2b8e1c68720182ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf
3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.096812 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.113372 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.113420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.113435 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.113456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.113469 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.131192 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.147679 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.170186 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.189201 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.211438 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.216329 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.216391 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.216467 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.216830 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.216867 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.231495 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.251602 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"2025-12-06T01:02:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c\\\\n2025-12-06T01:02:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c to /host/opt/cni/bin/\\\\n2025-12-06T01:02:33Z [verbose] multus-daemon started\\\\n2025-12-06T01:02:33Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T01:03:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.269980 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6
d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.287340 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca
726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.307463 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.319697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.319757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.319775 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.319804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.319824 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.327557 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.348880 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.369083 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.394047 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.410708 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:27Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:27 crc 
kubenswrapper[4991]: I1206 01:03:27.423186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.423253 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.423269 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.423294 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.423310 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.527117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.527184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.527201 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.527228 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.527244 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.631377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.631456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.631480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.631515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.631536 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.735383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.735446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.735459 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.735483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.735497 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.838603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.838696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.838724 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.838768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.838793 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.942460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.942557 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.942577 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.942610 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:27 crc kubenswrapper[4991]: I1206 01:03:27.942630 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:27Z","lastTransitionTime":"2025-12-06T01:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.037230 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.037350 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:28 crc kubenswrapper[4991]: E1206 01:03:28.037429 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.037456 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.037521 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:28 crc kubenswrapper[4991]: E1206 01:03:28.037757 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:28 crc kubenswrapper[4991]: E1206 01:03:28.038260 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:28 crc kubenswrapper[4991]: E1206 01:03:28.038481 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.038614 4991 scope.go:117] "RemoveContainer" containerID="34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12" Dec 06 01:03:28 crc kubenswrapper[4991]: E1206 01:03:28.038807 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.046204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.046281 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.046302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.046331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.046410 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.150092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.150166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.150191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.150228 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.150253 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.253980 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.254078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.254096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.254126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.254144 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.357358 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.357410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.357423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.357441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.357454 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.460056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.460128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.460143 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.460170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.460186 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.563189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.563251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.563270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.563297 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.563318 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.666696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.666770 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.666788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.666820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.666845 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.769876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.769963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.770033 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.770063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.770082 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.873700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.873743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.873756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.873778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.873792 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.976619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.976711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.976822 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.976874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:28 crc kubenswrapper[4991]: I1206 01:03:28.976900 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:28Z","lastTransitionTime":"2025-12-06T01:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.079904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.079976 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.080060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.080092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.080115 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.182947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.183050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.183089 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.183124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.183147 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.285845 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.285886 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.285896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.285915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.285927 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.389329 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.389398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.389416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.389517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.389537 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.493287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.493340 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.493357 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.493379 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.493393 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.595893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.595949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.595962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.596003 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.596017 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.698847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.698892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.698905 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.698924 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.698936 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.802734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.802807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.802830 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.802866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.802892 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.873682 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:03:29 crc kubenswrapper[4991]: E1206 01:03:29.874109 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:33.874064443 +0000 UTC m=+147.224354931 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.905858 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.905922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.905935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.905954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:29 crc kubenswrapper[4991]: I1206 01:03:29.905967 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:29Z","lastTransitionTime":"2025-12-06T01:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.009147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.009205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.009216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.009233 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.009247 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.037795 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.037879 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.037931 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.037961 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.038131 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.038299 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.038425 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.038530 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.076557 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.076629 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.076653 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.076679 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.076825 4991 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.076947 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:04:34.076919996 +0000 UTC m=+147.427210465 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.076979 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077062 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.076837 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077081 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077059 4991 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077204 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077227 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077157 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-06 01:04:34.077136067 +0000 UTC m=+147.427426535 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077277 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-06 01:04:34.077265813 +0000 UTC m=+147.427556281 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 06 01:03:30 crc kubenswrapper[4991]: E1206 01:03:30.077299 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-06 01:04:34.077292405 +0000 UTC m=+147.427582873 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.112879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.112973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.113078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.113126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.113165 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.216508 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.216576 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.216592 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.216616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.216635 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.320100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.320161 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.320180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.320207 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.320226 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.423477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.423548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.423563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.423584 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.423599 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.527184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.527239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.527251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.527273 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.527285 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.630065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.630123 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.630141 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.630170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.630189 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.734189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.734256 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.734275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.734302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.734320 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.837505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.837588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.837617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.837652 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.837674 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.941422 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.941481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.941495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.941515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:30 crc kubenswrapper[4991]: I1206 01:03:30.941528 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:30Z","lastTransitionTime":"2025-12-06T01:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.044318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.044411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.044436 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.044473 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.044499 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.147285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.147355 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.147375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.147404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.147423 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.250504 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.250937 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.251152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.251312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.251461 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.355408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.355474 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.355491 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.355521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.355540 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.458863 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.458928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.458946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.458973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.459062 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.562522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.562590 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.562603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.562626 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.562644 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.666167 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.666306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.666361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.666391 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.666409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.769612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.769687 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.769705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.769728 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.769743 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.874122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.874195 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.874218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.874245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.874263 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.976658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.976700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.976708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.976721 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:31 crc kubenswrapper[4991]: I1206 01:03:31.976730 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:31Z","lastTransitionTime":"2025-12-06T01:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.037480 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.037565 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.037565 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:32 crc kubenswrapper[4991]: E1206 01:03:32.037719 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:32 crc kubenswrapper[4991]: E1206 01:03:32.037933 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.038129 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:32 crc kubenswrapper[4991]: E1206 01:03:32.038331 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:32 crc kubenswrapper[4991]: E1206 01:03:32.038431 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.080825 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.080908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.080928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.080962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.080979 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.185117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.185217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.185244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.185280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.185307 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.289672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.289760 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.289782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.289813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.289837 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.392870 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.392943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.392967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.393027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.393102 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.495893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.495945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.495957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.495978 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.496041 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.598927 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.599458 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.599614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.599751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.599892 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.703432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.703811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.703956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.704092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.704177 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.807598 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.808108 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.808242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.808371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.808441 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.911500 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.911575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.911599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.911631 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:32 crc kubenswrapper[4991]: I1206 01:03:32.911657 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:32Z","lastTransitionTime":"2025-12-06T01:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.015092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.015155 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.015168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.015190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.015204 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.118334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.118401 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.118422 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.118453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.118479 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.220885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.220965 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.221017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.221054 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.221078 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.324366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.324426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.324441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.324466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.324482 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.428079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.428135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.428150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.428172 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.428187 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.531047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.531103 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.531116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.531139 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.531153 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.634105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.634462 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.634622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.634748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.634870 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.738427 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.738479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.738488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.738513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.738526 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.841582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.841643 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.841653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.841673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.841683 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.944108 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.944154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.944167 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.944187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:33 crc kubenswrapper[4991]: I1206 01:03:33.944201 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:33Z","lastTransitionTime":"2025-12-06T01:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.037413 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.037471 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.037443 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.037633 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.037765 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.037975 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.038066 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.038138 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.047618 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.047682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.047700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.047728 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.047748 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.151189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.151289 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.151312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.151344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.151362 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.254244 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.254319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.254334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.254356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.254374 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.358606 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.358684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.358699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.358718 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.358734 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.461699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.461756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.461767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.461784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.461793 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.487239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.487294 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.487307 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.487329 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.487344 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.509078 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.516082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.516141 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.516159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.516184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.516203 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.539051 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.547630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.547696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.547717 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.547746 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.547770 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.577304 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.583257 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.583312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.583334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.583364 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.583388 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.606404 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.613197 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.613294 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.613310 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.613332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.613347 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.630690 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"596c3607-4a7a-43fe-9f86-5d0b3f7502dc\\\",\\\"systemUUID\\\":\\\"a16d5df6-771f-4852-a5a1-7a32af9d583d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:34Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:34 crc kubenswrapper[4991]: E1206 01:03:34.630860 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.632880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.632919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.632932 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.632950 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.632963 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.736414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.736837 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.737013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.737165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.737291 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.840363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.840452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.840476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.840509 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.840534 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.944091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.944445 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.944531 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.944624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:34 crc kubenswrapper[4991]: I1206 01:03:34.944724 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:34Z","lastTransitionTime":"2025-12-06T01:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.047423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.047485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.047497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.047522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.047536 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.059671 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.150883 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.150953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.150972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.151014 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.151030 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.254504 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.254571 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.254589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.254612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.254627 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.357450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.357839 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.358054 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.358301 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.358446 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.461332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.461810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.461870 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.462119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.462219 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.566137 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.566207 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.566218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.566238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.566255 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.669627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.669687 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.669712 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.669749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.669773 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.772965 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.773140 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.773162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.773234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.773260 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.877356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.877407 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.877421 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.877443 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.877458 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.980547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.980632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.980650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.980678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:35 crc kubenswrapper[4991]: I1206 01:03:35.980700 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:35Z","lastTransitionTime":"2025-12-06T01:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.037885 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.037963 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.038018 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.038112 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:36 crc kubenswrapper[4991]: E1206 01:03:36.038225 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:36 crc kubenswrapper[4991]: E1206 01:03:36.038423 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:36 crc kubenswrapper[4991]: E1206 01:03:36.038601 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:36 crc kubenswrapper[4991]: E1206 01:03:36.038832 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.084226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.084322 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.084363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.084399 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.084442 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.187962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.188101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.188119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.188148 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.188168 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.291778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.291873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.291908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.291943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.291966 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.395545 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.395609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.395634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.395657 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.395673 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.498312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.498397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.498409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.498449 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.498462 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.601550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.602110 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.602291 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.602452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.602614 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.712481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.712573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.712597 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.712634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.712658 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.815567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.816024 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.816218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.816423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.816593 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.920047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.920116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.920131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.920152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:36 crc kubenswrapper[4991]: I1206 01:03:36.920165 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:36Z","lastTransitionTime":"2025-12-06T01:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.023071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.023446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.023511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.023612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.023676 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.065764 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791
f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.083127 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.100145 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43bf08a-5774-4727-b681-b5d2291e51dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cf257527acb99d14685360262c24dfe896a1ae4e11680b2b8e1c68720182ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc 
kubenswrapper[4991]: I1206 01:03:37.122504 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.126204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.126236 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.126245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.126261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.126272 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.137020 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.151964 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.168100 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\"
,\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 
01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.187292 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.199886 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.215309 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.230064 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.230106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.230116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.230133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.230145 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.231202 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.247578 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"2025-12-06T01:02:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c\\\\n2025-12-06T01:02:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c to /host/opt/cni/bin/\\\\n2025-12-06T01:02:33Z [verbose] multus-daemon started\\\\n2025-12-06T01:02:33Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T01:03:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.264903 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6
d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.276097 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca
726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.287953 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.300473 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.313637 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.332809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.333073 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.333200 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.333274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.333343 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.337708 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d3358e-659e-413a-bddb-ce2156f5993c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45e185f43e3c61c2066f6d137d91c8f928e66602e40f9e8fd53109b2a4d949f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://380f473b8f2c7bb765927d3ce7e5b8c8b7e256f33563a3b757ef6a9607bd797e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c051621cde74bbe99154e5e507969df2760eb4f1a35dbca7eb839e13aa47e6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a28671e696c6a362c864f4faec7570ef5396fdebb90e18e21d60b5a109982882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87816861e9eab5693a2d8d426667bd10220b806d625f48022cb7a64f01cbdda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9ac5eacade675cc065a0b9bbf0f5ef9622e237ff8dd6cc253f446184fba1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9ac5eacade675cc065a0b9bbf0f5ef9622e237ff8dd6cc253f446184fba1f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90c095e0eedbf69c2bb3c468075593ce3376700cdc571994afac4d49a09e5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e90c095e0eedbf69c2bb3c468075593ce3376700cdc571994afac4d49a09e5fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://76f9a40591f2f1c682e7f534acb271d5e015c380acbd8495e30082b0e4114b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f9a40591f2f1c682e7f534acb271d5e015c380acbd8495e30082b0e4114b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.353214 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:37Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.436275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.436311 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.436320 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.436334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.436345 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.539734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.539785 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.539800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.539819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.539832 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.642758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.642804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.642815 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.642833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.642843 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.746084 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.746128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.746138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.746155 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.746189 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.849067 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.849144 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.849163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.849187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.849201 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.953545 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.953591 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.953604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.953622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:37 crc kubenswrapper[4991]: I1206 01:03:37.953635 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:37Z","lastTransitionTime":"2025-12-06T01:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.037355 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.037442 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.037475 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:38 crc kubenswrapper[4991]: E1206 01:03:38.038111 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:38 crc kubenswrapper[4991]: E1206 01:03:38.037884 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:38 crc kubenswrapper[4991]: E1206 01:03:38.038244 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.037531 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:38 crc kubenswrapper[4991]: E1206 01:03:38.038425 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.057175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.057224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.057237 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.057258 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.057272 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.159901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.159954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.159970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.160018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.160035 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.262703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.262753 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.262764 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.262784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.262798 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.365180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.365242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.365255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.365277 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.365297 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.468310 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.468388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.468413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.468445 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.468468 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.571865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.571904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.571918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.571936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.571946 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.675261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.675327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.675345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.675373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.675387 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.778295 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.778352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.778367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.778387 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.778400 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.881082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.881131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.881143 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.881163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.881180 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.983963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.984078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.984102 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.984136 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:38 crc kubenswrapper[4991]: I1206 01:03:38.984161 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:38Z","lastTransitionTime":"2025-12-06T01:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.086570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.086627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.086641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.086662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.086674 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.189527 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.189612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.189631 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.189655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.189671 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.292907 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.292955 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.292966 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.293013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.293030 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.396555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.396613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.396624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.396645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.396657 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.499710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.499766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.499777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.499801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.499815 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.602550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.602610 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.602624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.602646 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.602659 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.705978 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.706091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.706105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.706164 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.706182 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.809416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.809503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.809530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.809565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.809594 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.913101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.913195 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.913218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.913256 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:39 crc kubenswrapper[4991]: I1206 01:03:39.913284 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:39Z","lastTransitionTime":"2025-12-06T01:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.016071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.016118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.016131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.016149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.016162 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.036861 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:40 crc kubenswrapper[4991]: E1206 01:03:40.037078 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.037326 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:40 crc kubenswrapper[4991]: E1206 01:03:40.037392 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.037555 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:40 crc kubenswrapper[4991]: E1206 01:03:40.037647 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.037776 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:40 crc kubenswrapper[4991]: E1206 01:03:40.037846 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.119719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.119764 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.119775 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.119795 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.119808 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.222852 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.222908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.222920 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.222939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.222951 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.326558 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.326630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.326648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.326676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.326693 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.434227 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.434271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.434289 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.434305 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.434315 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.538581 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.538672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.538697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.538918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.538944 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.642389 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.642430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.642440 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.642457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.642467 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.746895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.746940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.746950 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.746967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.746977 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.849299 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.849342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.849352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.849368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.849380 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.951687 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.951739 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.951749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.951763 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:40 crc kubenswrapper[4991]: I1206 01:03:40.951772 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:40Z","lastTransitionTime":"2025-12-06T01:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.056213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.056275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.056294 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.056318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.056334 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.158744 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.158797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.158807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.158826 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.158836 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.261634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.261683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.261696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.261714 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.262099 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.364336 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.364378 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.364387 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.364402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.364413 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.467177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.467246 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.467271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.467302 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.467326 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.570695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.570771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.570783 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.570808 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.570825 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.674316 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.674374 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.674387 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.674424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.674441 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.777068 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.777107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.777117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.777131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.777141 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.879893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.879934 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.879943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.879959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.879968 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.983043 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.983087 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.983096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.983115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:41 crc kubenswrapper[4991]: I1206 01:03:41.983127 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:41Z","lastTransitionTime":"2025-12-06T01:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.037402 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:42 crc kubenswrapper[4991]: E1206 01:03:42.037609 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.038703 4991 scope.go:117] "RemoveContainer" containerID="34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.039142 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:42 crc kubenswrapper[4991]: E1206 01:03:42.039233 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.039393 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:42 crc kubenswrapper[4991]: E1206 01:03:42.039460 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.039591 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:42 crc kubenswrapper[4991]: E1206 01:03:42.039655 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.085986 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.086316 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.086324 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.086339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.086348 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.188899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.188943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.188955 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.188993 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.189008 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.291746 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.291800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.291815 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.291835 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.291850 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.394892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.394946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.394962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.395006 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.395021 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.497613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.497671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.497688 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.497720 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.497738 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.600836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.600895 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.600908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.601206 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.601243 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.630816 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/2.log" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.633560 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.634157 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.647489 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c85b3d7-edd3-4d42-bb8b-c7892feeff04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4192bffb9f5423bfaf52510034884b3bbdd764b00e1b3dae354bcfe933a8c0e\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4916d46886a800ca9719abe6ad8686b6208fec6dd5a84454728e228ac5cb46e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd576444486a121ae8f02a6a7a64402e2db59d1baf0410a0b0036de8a0a1963e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e6ba531f19ebc253e48073eadf13d896be7771a81abf986f83893cc4eb3ea9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.658529 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021d3a65c56a737a682268cde0d57a186389f597de592f541a72ea5958a170c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70371e1283910d9d652089554a815343eb1d54110a2472e067d5a2e8878c9193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.670279 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d8086b693e86f8e2ea45695911c64df4acded68da7cc61db4cdbc19b2402879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.682382 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.693423 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sq7kg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f95c58-fd69-48f6-94be-15927aa850c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:20Z\\\",\\\"message\\\":\\\"2025-12-06T01:02:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c\\\\n2025-12-06T01:02:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8875cecd-5c55-4c33-8830-2385a56cc89c to /host/opt/cni/bin/\\\\n2025-12-06T01:02:33Z [verbose] multus-daemon started\\\\n2025-12-06T01:02:33Z [verbose] 
Readiness Indicator file check\\\\n2025-12-06T01:03:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw8lm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sq7kg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.703428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.703477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.703492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.703517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.703532 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.708165 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tw85g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e52ba3-560c-483d-9c51-898cbe94aad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f6d3d0e90489e71ee7e5e352e5d1752a7248686adf123c15f2f9aba34e9c556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e1aecfebc47a1fefcca2ce3b56a04b91805a56beed0357e49a89427f0550f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://5be3101fed463c931699eb0fd34d1780ab3b77fa0ca8e92a7ab6e982c0d7dfbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e821a0511240664d306ee6c25177cbce216253d489b2479cf27a0fc73360bb78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9951f6e5ff6b8822bd6e51521a2a427c15f6a41d6407baf9226b1552a414604\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8e6933d7cc98af06bf8b19ebf437210e8718ba9882195577d8dfb2baf1e0cef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ca1251ee195826f90d746ad7b10cf7902a77b6cf5271a83644d3f972c308a5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x44cm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tw85g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.718142 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfebf788-e0ac-4bd4-8926-1f7b7cabba71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60a6e0233b2965c7438c6ee9467c180d9cd0b1407e2ed0279cc0947686bb9fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://966bddb0df2e52745119f2f51e57c859a39ca726270405f7b9af19c9d90a4118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9gpqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t8t9g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.729495 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5409f18b-76b3-4388-a9c8-50a1ec8a37d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3c6493775f81f3e0feeee29f2b3c8420a15ba9ce8117fab50521b2251af46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6fa450da9e79fab8a6d70eb261ad080499995eef8884b2e01c68f44f3d1149\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95cb22622bd70c57cf25cb4e61bec6541175f1fee668e4b1aaa7e36853827315\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.745401 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"374025e5-ccaa-4cc4-816b-136713f17d6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khpzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:46Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q8mbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc 
kubenswrapper[4991]: I1206 01:03:42.760654 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kw4lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c790aea7-3412-47a6-a3d0-3dd019c26022\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4698040873e672e01eca6a4791166d92bd11bdcc3197bbe0340472eba3ba69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnwnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kw4lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.780606 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d3358e-659e-413a-bddb-ce2156f5993c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b45e185f43e3c61c2066f6d137d91c8f928e66602e40f9e8fd53109b2a4d949f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://380f473b8f2c7bb765927d3ce7e5b8c8b7e256f33563a3b757ef6a9607bd797e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c051621cde74bbe99154e5e507969df2760eb4f1a35dbca7eb839e13aa47e6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a28671e696c6a362c864f4faec7570ef5396fdebb90e18e21d60b5a109982882\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87816861e9eab5693a2d8d426667bd10220b806d625f48022cb7a64f01cbdda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9ac5eacade675cc065a0b9bbf0f5ef9622e237ff8dd6cc253f446184fba1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c9ac5eacade675cc065a0b9bbf0f5ef9622e237ff8dd6cc253f446184fba1f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90c095e0eedbf69c2bb3c468075593ce3376700cdc571994afac4d49a09e5fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e90c095e0eedbf69c2bb3c468075593ce3376700cdc571994afac4d49a09e5fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://76f9a40591f2f1c682e7f534acb271d5e015c380acbd8495e30082b0e4114b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76f9a40591f2f1c682e7f534acb271d5e015c380acbd8495e30082b0e4114b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.794025 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc01e1f52af4386bd143ff8779458f0e335475f40f067c608c70a765d6fad952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.806542 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.806600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.806610 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.806628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.806639 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.814428 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1c8508-3f35-44b2-988e-cc665121ce94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-06T01:03:14Z\\\",\\\"message\\\":\\\"ap[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1206 01:03:14.045336 6896 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045350 6896 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1206 01:03:14.045361 6896 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1206 01:03:14.045398 6896 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1206 01:03:14.045406 6896 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:03:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pnfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5t29r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.824995 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g2v2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c69985-6af9-4dc7-9e6f-9cb4c8ad1b37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee92c0e499eed2711d5027aba6265d26549865fbfb26de5dd38442e3c9a852b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlkqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g2v2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.841350 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c43bf08a-5774-4727-b681-b5d2291e51dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cf257527acb99d14685360262c24dfe896a1ae4e11680b2b8e1c68720182ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bdff2c70d7492d7819fa53b5844cd7b7c50157c2ca34841bcdd8fbffc1345\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc 
kubenswrapper[4991]: I1206 01:03:42.862787 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.886962 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.897699 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f35cea72-9e31-4432-9e3b-16a50ebce794\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9893b4133f9e41bd3fa958f5e1c25f5f0eb3cd3bd8d195d58f282866dc850c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa
6afe57ac9858923e3fe6ce15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8lq67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ppxx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.909250 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.909293 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.909304 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:42 crc 
kubenswrapper[4991]: I1206 01:03:42.909319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.909328 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:42Z","lastTransitionTime":"2025-12-06T01:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:42 crc kubenswrapper[4991]: I1206 01:03:42.913574 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f3f8be-7093-40c6-8c6e-6bb65c878b31\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:03:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01
:02:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-06T01:02:26Z\\\",\\\"message\\\":\\\"ication::requestheader-client-ca-file\\\\nI1206 01:02:26.397262 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1206 01:02:26.397306 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1206 01:02:26.397362 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764982931\\\\\\\\\\\\\\\" (2025-12-06 01:02:10 +0000 UTC to 2026-01-05 01:02:11 +0000 UTC (now=2025-12-06 01:02:26.397254869 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397406 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1206 01:02:26.397421 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" 
enabled=false\\\\nI1206 01:02:26.397440 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721411396/tls.crt::/tmp/serving-cert-3721411396/tls.key\\\\\\\"\\\\nI1206 01:02:26.397610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764982946\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764982946\\\\\\\\\\\\\\\" (2025-12-06 00:02:26 +0000 UTC to 2026-12-06 00:02:26 +0000 UTC (now=2025-12-06 01:02:26.397572448 +0000 UTC))\\\\\\\"\\\\nI1206 01:02:26.397373 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1206 01:02:26.397647 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1206 01:02:26.397687 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1206 01:02:26.397767 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1206 01:02:26.399022 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-06T01:02:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-06T01:02:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-06T01:02:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-06T01:02:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-06T01:02:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-06T01:03:42Z is after 2025-08-24T17:21:41Z" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.012342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.012388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.012398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.012414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.012426 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.115209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.115254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.115272 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.115291 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.115301 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.217038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.217069 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.217077 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.217093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.217103 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.319772 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.319803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.319811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.319824 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.319834 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.422403 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.422448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.422459 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.422477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.422487 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.526186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.526276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.526319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.526345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.526361 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.629776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.629843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.629857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.629883 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.629900 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.733536 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.733616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.733637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.733667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.733692 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.839168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.839279 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.839298 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.839404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.839466 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.942689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.942764 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.942776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.942797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:43 crc kubenswrapper[4991]: I1206 01:03:43.942811 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:43Z","lastTransitionTime":"2025-12-06T01:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.037836 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.037926 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.037841 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.038117 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:44 crc kubenswrapper[4991]: E1206 01:03:44.038185 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:44 crc kubenswrapper[4991]: E1206 01:03:44.038469 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:44 crc kubenswrapper[4991]: E1206 01:03:44.038615 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:44 crc kubenswrapper[4991]: E1206 01:03:44.038734 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.046651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.046697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.046707 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.046726 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.046741 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.149859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.149930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.149940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.149994 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.150007 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.252672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.252710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.252721 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.252736 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.252747 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.355572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.355622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.355634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.355651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.355663 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.459242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.459305 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.459318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.459342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.459359 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.561911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.562027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.562061 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.562096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.562121 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.644035 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/3.log" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.645027 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/2.log" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.649872 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" exitCode=1 Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.649916 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.650017 4991 scope.go:117] "RemoveContainer" containerID="34a6fd158c0c0ae9d2cd68294056b2c96405dc3c4bfe170d8370f86310d8fd12" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.651791 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:03:44 crc kubenswrapper[4991]: E1206 01:03:44.652314 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.667399 4991 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.667485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.667510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.667546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.667577 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.717302 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.717279746 podStartE2EDuration="1m17.717279746s" podCreationTimestamp="2025-12-06 01:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.697695314 +0000 UTC m=+98.047985792" watchObservedRunningTime="2025-12-06 01:03:44.717279746 +0000 UTC m=+98.067570224" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.717641 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.717634584 podStartE2EDuration="42.717634584s" podCreationTimestamp="2025-12-06 01:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.716875486 +0000 UTC m=+98.067165984" watchObservedRunningTime="2025-12-06 01:03:44.717634584 +0000 UTC m=+98.067925062" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.773306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.773344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.773361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.773383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.773402 4991 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.786759 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sq7kg" podStartSLOduration=73.786736687 podStartE2EDuration="1m13.786736687s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.786605711 +0000 UTC m=+98.136896189" watchObservedRunningTime="2025-12-06 01:03:44.786736687 +0000 UTC m=+98.137027195" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.814883 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tw85g" podStartSLOduration=73.814856168 podStartE2EDuration="1m13.814856168s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.81347277 +0000 UTC m=+98.163763248" watchObservedRunningTime="2025-12-06 01:03:44.814856168 +0000 UTC m=+98.165146646" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.853598 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t8t9g" podStartSLOduration=72.85357561000001 podStartE2EDuration="1m12.85357561s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-06 01:03:44.836647599 +0000 UTC m=+98.186938087" watchObservedRunningTime="2025-12-06 01:03:44.85357561 +0000 UTC m=+98.203866088" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.876811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.876867 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.876879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.876899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.876912 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.914673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.915007 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.915079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.915154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.915283 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-06T01:03:44Z","lastTransitionTime":"2025-12-06T01:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.915409 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=9.915384486 podStartE2EDuration="9.915384486s" podCreationTimestamp="2025-12-06 01:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.91445416 +0000 UTC m=+98.264744638" watchObservedRunningTime="2025-12-06 01:03:44.915384486 +0000 UTC m=+98.265674954" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.927638 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kw4lt" podStartSLOduration=73.927612676 podStartE2EDuration="1m13.927612676s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.927202716 +0000 UTC m=+98.277493204" watchObservedRunningTime="2025-12-06 01:03:44.927612676 +0000 UTC m=+98.277903144" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.964829 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.964803663 podStartE2EDuration="26.964803663s" podCreationTimestamp="2025-12-06 01:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.948226309 +0000 UTC m=+98.298516777" watchObservedRunningTime="2025-12-06 01:03:44.964803663 +0000 UTC m=+98.315094131" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.969899 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk"] Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.970397 4991 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.972602 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.972764 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.973035 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.974245 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 01:03:44 crc kubenswrapper[4991]: I1206 01:03:44.999414 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g2v2x" podStartSLOduration=73.999389482 podStartE2EDuration="1m13.999389482s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:44.999316978 +0000 UTC m=+98.349607446" watchObservedRunningTime="2025-12-06 01:03:44.999389482 +0000 UTC m=+98.349679950" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.027698 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.027672641 podStartE2EDuration="1m19.027672641s" podCreationTimestamp="2025-12-06 01:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:45.013027111 +0000 UTC m=+98.363317579" watchObservedRunningTime="2025-12-06 
01:03:45.027672641 +0000 UTC m=+98.377963099" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.060176 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f244ce2-82c9-40b9-8a95-581a64884d27-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.060262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0f244ce2-82c9-40b9-8a95-581a64884d27-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.060297 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0f244ce2-82c9-40b9-8a95-581a64884d27-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.060325 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f244ce2-82c9-40b9-8a95-581a64884d27-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.060366 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f244ce2-82c9-40b9-8a95-581a64884d27-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.060839 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podStartSLOduration=74.060810558 podStartE2EDuration="1m14.060810558s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:03:45.05921977 +0000 UTC m=+98.409510248" watchObservedRunningTime="2025-12-06 01:03:45.060810558 +0000 UTC m=+98.411101026" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.173795 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f244ce2-82c9-40b9-8a95-581a64884d27-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.173851 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0f244ce2-82c9-40b9-8a95-581a64884d27-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.173877 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/0f244ce2-82c9-40b9-8a95-581a64884d27-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.173901 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f244ce2-82c9-40b9-8a95-581a64884d27-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.173933 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f244ce2-82c9-40b9-8a95-581a64884d27-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.174041 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0f244ce2-82c9-40b9-8a95-581a64884d27-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.174153 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0f244ce2-82c9-40b9-8a95-581a64884d27-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc 
kubenswrapper[4991]: I1206 01:03:45.175059 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f244ce2-82c9-40b9-8a95-581a64884d27-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.188879 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f244ce2-82c9-40b9-8a95-581a64884d27-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.196877 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f244ce2-82c9-40b9-8a95-581a64884d27-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tl7pk\" (UID: \"0f244ce2-82c9-40b9-8a95-581a64884d27\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.283428 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" Dec 06 01:03:45 crc kubenswrapper[4991]: W1206 01:03:45.301730 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f244ce2_82c9_40b9_8a95_581a64884d27.slice/crio-137e8c3e5e85194d12e2dbcc83b59026d49c169c8b8306433780f537afa68fdd WatchSource:0}: Error finding container 137e8c3e5e85194d12e2dbcc83b59026d49c169c8b8306433780f537afa68fdd: Status 404 returned error can't find the container with id 137e8c3e5e85194d12e2dbcc83b59026d49c169c8b8306433780f537afa68fdd Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.657471 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/3.log" Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.666388 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" event={"ID":"0f244ce2-82c9-40b9-8a95-581a64884d27","Type":"ContainerStarted","Data":"031243a169c978d2e9f3e967c85a09dac71d3837abc25f9de1d7ec2a55f1ed26"} Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.666477 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" event={"ID":"0f244ce2-82c9-40b9-8a95-581a64884d27","Type":"ContainerStarted","Data":"137e8c3e5e85194d12e2dbcc83b59026d49c169c8b8306433780f537afa68fdd"} Dec 06 01:03:45 crc kubenswrapper[4991]: I1206 01:03:45.692768 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tl7pk" podStartSLOduration=74.692687872 podStartE2EDuration="1m14.692687872s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-06 01:03:45.689349408 +0000 UTC m=+99.039639876" watchObservedRunningTime="2025-12-06 01:03:45.692687872 +0000 UTC m=+99.042978380" Dec 06 01:03:46 crc kubenswrapper[4991]: I1206 01:03:46.037486 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:46 crc kubenswrapper[4991]: I1206 01:03:46.037522 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:46 crc kubenswrapper[4991]: E1206 01:03:46.037610 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:46 crc kubenswrapper[4991]: I1206 01:03:46.037501 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:46 crc kubenswrapper[4991]: E1206 01:03:46.037692 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:46 crc kubenswrapper[4991]: E1206 01:03:46.037933 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:46 crc kubenswrapper[4991]: I1206 01:03:46.038399 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:46 crc kubenswrapper[4991]: E1206 01:03:46.038586 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:48 crc kubenswrapper[4991]: I1206 01:03:48.037617 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:48 crc kubenswrapper[4991]: I1206 01:03:48.037692 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:48 crc kubenswrapper[4991]: I1206 01:03:48.037643 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:48 crc kubenswrapper[4991]: E1206 01:03:48.037826 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:48 crc kubenswrapper[4991]: E1206 01:03:48.038399 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:48 crc kubenswrapper[4991]: E1206 01:03:48.038525 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:48 crc kubenswrapper[4991]: I1206 01:03:48.039415 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:48 crc kubenswrapper[4991]: E1206 01:03:48.039686 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:50 crc kubenswrapper[4991]: I1206 01:03:50.036930 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:50 crc kubenswrapper[4991]: I1206 01:03:50.037036 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:50 crc kubenswrapper[4991]: I1206 01:03:50.036930 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:50 crc kubenswrapper[4991]: I1206 01:03:50.037155 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:50 crc kubenswrapper[4991]: E1206 01:03:50.037228 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:50 crc kubenswrapper[4991]: E1206 01:03:50.037335 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:50 crc kubenswrapper[4991]: E1206 01:03:50.037397 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:50 crc kubenswrapper[4991]: E1206 01:03:50.037544 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:50 crc kubenswrapper[4991]: I1206 01:03:50.345490 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:50 crc kubenswrapper[4991]: E1206 01:03:50.345777 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:03:50 crc kubenswrapper[4991]: E1206 01:03:50.345887 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs podName:374025e5-ccaa-4cc4-816b-136713f17d6d nodeName:}" failed. No retries permitted until 2025-12-06 01:04:54.345854465 +0000 UTC m=+167.696144943 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs") pod "network-metrics-daemon-q8mbt" (UID: "374025e5-ccaa-4cc4-816b-136713f17d6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 06 01:03:52 crc kubenswrapper[4991]: I1206 01:03:52.037649 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:52 crc kubenswrapper[4991]: I1206 01:03:52.037722 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:52 crc kubenswrapper[4991]: I1206 01:03:52.037848 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:52 crc kubenswrapper[4991]: I1206 01:03:52.037955 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:52 crc kubenswrapper[4991]: E1206 01:03:52.039169 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:52 crc kubenswrapper[4991]: E1206 01:03:52.039566 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:52 crc kubenswrapper[4991]: E1206 01:03:52.039670 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:52 crc kubenswrapper[4991]: E1206 01:03:52.039765 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:54 crc kubenswrapper[4991]: I1206 01:03:54.037598 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:54 crc kubenswrapper[4991]: I1206 01:03:54.037656 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:54 crc kubenswrapper[4991]: I1206 01:03:54.037700 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:54 crc kubenswrapper[4991]: I1206 01:03:54.037764 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:54 crc kubenswrapper[4991]: E1206 01:03:54.038201 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:54 crc kubenswrapper[4991]: E1206 01:03:54.038662 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:54 crc kubenswrapper[4991]: E1206 01:03:54.038761 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:54 crc kubenswrapper[4991]: E1206 01:03:54.038853 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:56 crc kubenswrapper[4991]: I1206 01:03:56.037493 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:56 crc kubenswrapper[4991]: I1206 01:03:56.037543 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:56 crc kubenswrapper[4991]: I1206 01:03:56.037617 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:56 crc kubenswrapper[4991]: E1206 01:03:56.037727 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:56 crc kubenswrapper[4991]: E1206 01:03:56.037976 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:56 crc kubenswrapper[4991]: E1206 01:03:56.038240 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:56 crc kubenswrapper[4991]: I1206 01:03:56.037493 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:56 crc kubenswrapper[4991]: E1206 01:03:56.039390 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:58 crc kubenswrapper[4991]: I1206 01:03:58.037466 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:03:58 crc kubenswrapper[4991]: I1206 01:03:58.037583 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:03:58 crc kubenswrapper[4991]: I1206 01:03:58.037650 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:03:58 crc kubenswrapper[4991]: E1206 01:03:58.037685 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:03:58 crc kubenswrapper[4991]: I1206 01:03:58.038035 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:03:58 crc kubenswrapper[4991]: E1206 01:03:58.038117 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:03:58 crc kubenswrapper[4991]: E1206 01:03:58.038323 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:03:58 crc kubenswrapper[4991]: E1206 01:03:58.038388 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:03:59 crc kubenswrapper[4991]: I1206 01:03:59.038167 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:03:59 crc kubenswrapper[4991]: E1206 01:03:59.038463 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:04:00 crc kubenswrapper[4991]: I1206 01:04:00.037367 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:00 crc kubenswrapper[4991]: I1206 01:04:00.037369 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:00 crc kubenswrapper[4991]: I1206 01:04:00.037505 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:00 crc kubenswrapper[4991]: I1206 01:04:00.037580 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:00 crc kubenswrapper[4991]: E1206 01:04:00.037706 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:00 crc kubenswrapper[4991]: E1206 01:04:00.037882 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:00 crc kubenswrapper[4991]: E1206 01:04:00.038047 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:00 crc kubenswrapper[4991]: E1206 01:04:00.038236 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:02 crc kubenswrapper[4991]: I1206 01:04:02.037118 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:02 crc kubenswrapper[4991]: I1206 01:04:02.037215 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:02 crc kubenswrapper[4991]: E1206 01:04:02.037286 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:02 crc kubenswrapper[4991]: I1206 01:04:02.037130 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:02 crc kubenswrapper[4991]: E1206 01:04:02.037382 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:02 crc kubenswrapper[4991]: I1206 01:04:02.037118 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:02 crc kubenswrapper[4991]: E1206 01:04:02.037527 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:02 crc kubenswrapper[4991]: E1206 01:04:02.037723 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:04 crc kubenswrapper[4991]: I1206 01:04:04.037619 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:04 crc kubenswrapper[4991]: I1206 01:04:04.037675 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:04 crc kubenswrapper[4991]: I1206 01:04:04.037653 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:04 crc kubenswrapper[4991]: E1206 01:04:04.037867 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:04 crc kubenswrapper[4991]: I1206 01:04:04.037953 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:04 crc kubenswrapper[4991]: E1206 01:04:04.038238 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:04 crc kubenswrapper[4991]: E1206 01:04:04.038400 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:04 crc kubenswrapper[4991]: E1206 01:04:04.038563 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.037088 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.037124 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.037166 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.037333 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:06 crc kubenswrapper[4991]: E1206 01:04:06.037325 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:06 crc kubenswrapper[4991]: E1206 01:04:06.037559 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:06 crc kubenswrapper[4991]: E1206 01:04:06.037660 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:06 crc kubenswrapper[4991]: E1206 01:04:06.037817 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.747341 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/1.log" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.748375 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/0.log" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.748436 4991 generic.go:334] "Generic (PLEG): container finished" podID="06f95c58-fd69-48f6-94be-15927aa850c2" containerID="ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072" exitCode=1 Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.748503 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerDied","Data":"ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072"} Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.748615 4991 scope.go:117] "RemoveContainer" containerID="d44b91702b22653a2cc67f20e049260e82a69445cff9d6d2abbec16d1e725147" Dec 06 01:04:06 crc kubenswrapper[4991]: I1206 01:04:06.749245 4991 scope.go:117] "RemoveContainer" containerID="ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072" Dec 06 01:04:06 crc kubenswrapper[4991]: E1206 01:04:06.749531 4991 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-sq7kg_openshift-multus(06f95c58-fd69-48f6-94be-15927aa850c2)\"" pod="openshift-multus/multus-sq7kg" podUID="06f95c58-fd69-48f6-94be-15927aa850c2" Dec 06 01:04:07 crc kubenswrapper[4991]: E1206 01:04:07.037491 4991 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 06 01:04:07 crc kubenswrapper[4991]: E1206 01:04:07.119365 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 01:04:07 crc kubenswrapper[4991]: I1206 01:04:07.754087 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/1.log" Dec 06 01:04:08 crc kubenswrapper[4991]: I1206 01:04:08.037367 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:08 crc kubenswrapper[4991]: I1206 01:04:08.037435 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:08 crc kubenswrapper[4991]: I1206 01:04:08.037490 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:08 crc kubenswrapper[4991]: I1206 01:04:08.037709 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:08 crc kubenswrapper[4991]: E1206 01:04:08.038336 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:08 crc kubenswrapper[4991]: E1206 01:04:08.038223 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:08 crc kubenswrapper[4991]: E1206 01:04:08.038435 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:08 crc kubenswrapper[4991]: E1206 01:04:08.038737 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:10 crc kubenswrapper[4991]: I1206 01:04:10.037239 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:10 crc kubenswrapper[4991]: I1206 01:04:10.037408 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:10 crc kubenswrapper[4991]: E1206 01:04:10.038282 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:10 crc kubenswrapper[4991]: I1206 01:04:10.037518 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:10 crc kubenswrapper[4991]: I1206 01:04:10.037491 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:10 crc kubenswrapper[4991]: E1206 01:04:10.038506 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:10 crc kubenswrapper[4991]: E1206 01:04:10.038404 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:10 crc kubenswrapper[4991]: E1206 01:04:10.038689 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:12 crc kubenswrapper[4991]: I1206 01:04:12.037558 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:12 crc kubenswrapper[4991]: I1206 01:04:12.037613 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:12 crc kubenswrapper[4991]: I1206 01:04:12.037699 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:12 crc kubenswrapper[4991]: E1206 01:04:12.037841 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:12 crc kubenswrapper[4991]: E1206 01:04:12.037949 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:12 crc kubenswrapper[4991]: I1206 01:04:12.038211 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:12 crc kubenswrapper[4991]: E1206 01:04:12.038210 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:12 crc kubenswrapper[4991]: E1206 01:04:12.038433 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:12 crc kubenswrapper[4991]: E1206 01:04:12.121275 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 01:04:14 crc kubenswrapper[4991]: I1206 01:04:14.037542 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:14 crc kubenswrapper[4991]: I1206 01:04:14.037601 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:14 crc kubenswrapper[4991]: I1206 01:04:14.037601 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:14 crc kubenswrapper[4991]: I1206 01:04:14.037648 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:14 crc kubenswrapper[4991]: E1206 01:04:14.038446 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:14 crc kubenswrapper[4991]: E1206 01:04:14.038525 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:14 crc kubenswrapper[4991]: E1206 01:04:14.038668 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:14 crc kubenswrapper[4991]: E1206 01:04:14.038875 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:14 crc kubenswrapper[4991]: I1206 01:04:14.039066 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:04:14 crc kubenswrapper[4991]: E1206 01:04:14.039360 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5t29r_openshift-ovn-kubernetes(7a1c8508-3f35-44b2-988e-cc665121ce94)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" Dec 06 01:04:16 crc kubenswrapper[4991]: I1206 01:04:16.037439 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:16 crc kubenswrapper[4991]: I1206 01:04:16.037502 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:16 crc kubenswrapper[4991]: I1206 01:04:16.037641 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:16 crc kubenswrapper[4991]: E1206 01:04:16.037837 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:16 crc kubenswrapper[4991]: I1206 01:04:16.037872 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:16 crc kubenswrapper[4991]: E1206 01:04:16.038090 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:16 crc kubenswrapper[4991]: E1206 01:04:16.038289 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:16 crc kubenswrapper[4991]: E1206 01:04:16.038400 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:17 crc kubenswrapper[4991]: E1206 01:04:17.121929 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 01:04:18 crc kubenswrapper[4991]: I1206 01:04:18.037399 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:18 crc kubenswrapper[4991]: E1206 01:04:18.037631 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:18 crc kubenswrapper[4991]: I1206 01:04:18.037659 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:18 crc kubenswrapper[4991]: I1206 01:04:18.037731 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:18 crc kubenswrapper[4991]: I1206 01:04:18.037932 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:18 crc kubenswrapper[4991]: E1206 01:04:18.038078 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:18 crc kubenswrapper[4991]: E1206 01:04:18.038171 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:18 crc kubenswrapper[4991]: E1206 01:04:18.038344 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:19 crc kubenswrapper[4991]: I1206 01:04:19.038114 4991 scope.go:117] "RemoveContainer" containerID="ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072" Dec 06 01:04:19 crc kubenswrapper[4991]: I1206 01:04:19.813728 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/1.log" Dec 06 01:04:19 crc kubenswrapper[4991]: I1206 01:04:19.814383 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerStarted","Data":"f8b1d542438157988a9ed58865b298ebff8c7d79f1ab942a399a01c263d6b4eb"} Dec 06 01:04:20 crc kubenswrapper[4991]: I1206 01:04:20.037172 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:20 crc kubenswrapper[4991]: I1206 01:04:20.037278 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:20 crc kubenswrapper[4991]: E1206 01:04:20.037341 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:20 crc kubenswrapper[4991]: I1206 01:04:20.037278 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:20 crc kubenswrapper[4991]: E1206 01:04:20.037505 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:20 crc kubenswrapper[4991]: I1206 01:04:20.037448 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:20 crc kubenswrapper[4991]: E1206 01:04:20.037593 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:20 crc kubenswrapper[4991]: E1206 01:04:20.037676 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:22 crc kubenswrapper[4991]: I1206 01:04:22.037883 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:22 crc kubenswrapper[4991]: I1206 01:04:22.037944 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:22 crc kubenswrapper[4991]: I1206 01:04:22.037903 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:22 crc kubenswrapper[4991]: E1206 01:04:22.038161 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:22 crc kubenswrapper[4991]: I1206 01:04:22.038209 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:22 crc kubenswrapper[4991]: E1206 01:04:22.038321 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:22 crc kubenswrapper[4991]: E1206 01:04:22.038456 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 06 01:04:22 crc kubenswrapper[4991]: E1206 01:04:22.038594 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 06 01:04:22 crc kubenswrapper[4991]: E1206 01:04:22.123660 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 06 01:04:24 crc kubenswrapper[4991]: I1206 01:04:24.036833 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 06 01:04:24 crc kubenswrapper[4991]: I1206 01:04:24.036883 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:24 crc kubenswrapper[4991]: E1206 01:04:24.038401 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 06 01:04:24 crc kubenswrapper[4991]: I1206 01:04:24.037041 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:04:24 crc kubenswrapper[4991]: E1206 01:04:24.038670 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d" Dec 06 01:04:24 crc kubenswrapper[4991]: I1206 01:04:24.036894 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 06 01:04:24 crc kubenswrapper[4991]: E1206 01:04:24.039102 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 01:04:24 crc kubenswrapper[4991]: E1206 01:04:24.039219 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 01:04:25 crc kubenswrapper[4991]: I1206 01:04:25.038686 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"
Dec 06 01:04:25 crc kubenswrapper[4991]: I1206 01:04:25.839325 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/3.log"
Dec 06 01:04:25 crc kubenswrapper[4991]: I1206 01:04:25.842031 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerStarted","Data":"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3"}
Dec 06 01:04:25 crc kubenswrapper[4991]: I1206 01:04:25.842579 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r"
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.036973 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:26 crc kubenswrapper[4991]: E1206 01:04:26.037127 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.037171 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:26 crc kubenswrapper[4991]: E1206 01:04:26.037259 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.037267 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt"
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.037309 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:26 crc kubenswrapper[4991]: E1206 01:04:26.037398 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 01:04:26 crc kubenswrapper[4991]: E1206 01:04:26.037559 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d"
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.311008 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podStartSLOduration=115.310972404 podStartE2EDuration="1m55.310972404s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:25.907022877 +0000 UTC m=+139.257313365" watchObservedRunningTime="2025-12-06 01:04:26.310972404 +0000 UTC m=+139.661262872"
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.311495 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q8mbt"]
Dec 06 01:04:26 crc kubenswrapper[4991]: I1206 01:04:26.846093 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt"
Dec 06 01:04:26 crc kubenswrapper[4991]: E1206 01:04:26.846318 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d"
Dec 06 01:04:27 crc kubenswrapper[4991]: E1206 01:04:27.124405 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 06 01:04:28 crc kubenswrapper[4991]: I1206 01:04:28.037370 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:28 crc kubenswrapper[4991]: E1206 01:04:28.037624 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 01:04:28 crc kubenswrapper[4991]: I1206 01:04:28.037836 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:28 crc kubenswrapper[4991]: E1206 01:04:28.038227 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 01:04:28 crc kubenswrapper[4991]: I1206 01:04:28.038435 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:28 crc kubenswrapper[4991]: E1206 01:04:28.038528 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 01:04:29 crc kubenswrapper[4991]: I1206 01:04:29.037552 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt"
Dec 06 01:04:29 crc kubenswrapper[4991]: E1206 01:04:29.037720 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d"
Dec 06 01:04:30 crc kubenswrapper[4991]: I1206 01:04:30.037593 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:30 crc kubenswrapper[4991]: I1206 01:04:30.037736 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:30 crc kubenswrapper[4991]: E1206 01:04:30.037834 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 01:04:30 crc kubenswrapper[4991]: I1206 01:04:30.037622 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:30 crc kubenswrapper[4991]: E1206 01:04:30.037927 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 01:04:30 crc kubenswrapper[4991]: E1206 01:04:30.038101 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 01:04:31 crc kubenswrapper[4991]: I1206 01:04:31.037654 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt"
Dec 06 01:04:31 crc kubenswrapper[4991]: E1206 01:04:31.038150 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q8mbt" podUID="374025e5-ccaa-4cc4-816b-136713f17d6d"
Dec 06 01:04:32 crc kubenswrapper[4991]: I1206 01:04:32.037918 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:32 crc kubenswrapper[4991]: I1206 01:04:32.038041 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:32 crc kubenswrapper[4991]: I1206 01:04:32.037918 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:32 crc kubenswrapper[4991]: E1206 01:04:32.038265 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 06 01:04:32 crc kubenswrapper[4991]: E1206 01:04:32.038439 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 06 01:04:32 crc kubenswrapper[4991]: E1206 01:04:32.038569 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 06 01:04:33 crc kubenswrapper[4991]: I1206 01:04:33.037351 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt"
Dec 06 01:04:33 crc kubenswrapper[4991]: I1206 01:04:33.040902 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 06 01:04:33 crc kubenswrapper[4991]: I1206 01:04:33.042684 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 06 01:04:33 crc kubenswrapper[4991]: I1206 01:04:33.915789 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 06 01:04:33 crc kubenswrapper[4991]: E1206 01:04:33.916160 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:06:35.916110345 +0000 UTC m=+269.266400843 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.037173 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.037173 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.037338 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.041335 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.041627 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.041853 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.042023 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.118832 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.118905 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.118944 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.119039 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.121279 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.129742 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.130144 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.130618 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.360681 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.371635 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.378806 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 06 01:04:34 crc kubenswrapper[4991]: W1206 01:04:34.667391 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-0172ac5a9fb347297de01f8686e84ecc688bfd888c7c4753959d56894b980ba1 WatchSource:0}: Error finding container 0172ac5a9fb347297de01f8686e84ecc688bfd888c7c4753959d56894b980ba1: Status 404 returned error can't find the container with id 0172ac5a9fb347297de01f8686e84ecc688bfd888c7c4753959d56894b980ba1
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.887116 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b815bbc6f0e3adb2fb32cad22594139f66103b9f2e3eff2ff36677db0c2b5bce"}
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.888909 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0172ac5a9fb347297de01f8686e84ecc688bfd888c7c4753959d56894b980ba1"}
Dec 06 01:04:34 crc kubenswrapper[4991]: I1206 01:04:34.890304 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"67909e1a99473a19eeab3d8901f35e709ea8975b0e2f367f9905ad59b0cb5533"}
Dec 06 01:04:35 crc kubenswrapper[4991]: I1206 01:04:35.896818 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b82986697f33ef3f29cfc6b2825c61e5f1a9b929d8aa4b8f68ee22494df45c6b"}
Dec 06 01:04:35 crc kubenswrapper[4991]: I1206 01:04:35.899113 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9fbd342bbc480545c0a1494bab61c87ea0fa1005743688ba950a9fb5cb1b1937"}
Dec 06 01:04:35 crc kubenswrapper[4991]: I1206 01:04:35.901240 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2052cd1747aaa410421f3c9688f532bdf1c67269c89a6fbecd47cd576c568a2d"}
Dec 06 01:04:35 crc kubenswrapper[4991]: I1206 01:04:35.901479 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.230260 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.282782 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.283239 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.284592 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.285150 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.287636 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.288782 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.289327 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.289587 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.289839 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.292081 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h594s"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.292858 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.293635 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.293761 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h594s"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.294740 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.296067 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.296246 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.304233 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.304421 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.307362 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.307398 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.314883 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.315790 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.316216 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.318883 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xh4tr"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.319594 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.325158 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.325205 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.325394 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.325672 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.328572 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.329457 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.330525 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.330817 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.330868 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.331099 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.331612 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.332082 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.332640 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hwqgl"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.332858 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.333594 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.334400 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mw769"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.334667 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.335175 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.336777 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.337593 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.337739 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.338460 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.338612 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.338696 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.338625 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.339672 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.340136 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.340471 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.340774 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.341205 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.341323 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.341524 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.342059 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.342237 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.344342 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.345182 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.345425 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.346073 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.346772 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.349828 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.350314 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.350576 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.358183 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.359685 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.360347 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.360708 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mzbqd"]
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361298 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361715 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9aca593e-3156-4661-9e09-5c9d355801d5-trusted-ca\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361819 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9aca593e-3156-4661-9e09-5c9d355801d5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361920 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-config\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362033 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-audit\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362127 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-config\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361782 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362333 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aca593e-3156-4661-9e09-5c9d355801d5-metrics-tls\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd63c06e-9514-4718-8678-db55b7deedfc-audit-dir\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362417 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2spk\" (UniqueName: \"kubernetes.io/projected/9aca593e-3156-4661-9e09-5c9d355801d5-kube-api-access-h2spk\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362451 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-image-import-ca\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s"
Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362504 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-serving-cert\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362530 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362562 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dd63c06e-9514-4718-8678-db55b7deedfc-node-pullsecrets\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362601 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbm75\" (UniqueName: \"kubernetes.io/projected/6ef47549-ccde-4565-935c-8a92998c5eb3-kube-api-access-bbm75\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362651 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-client-ca\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.362690 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9dh5\" (UniqueName: \"kubernetes.io/projected/dd63c06e-9514-4718-8678-db55b7deedfc-kube-api-access-p9dh5\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362720 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-encryption-config\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362749 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-etcd-client\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362839 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362045 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362838 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-etcd-serving-ca\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.363090 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef47549-ccde-4565-935c-8a92998c5eb3-serving-cert\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.363224 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.364038 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-97gxp"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361840 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.361884 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.362142 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.364320 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.364464 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.365283 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.365702 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.411246 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.411525 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.411836 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.412190 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.412449 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.412652 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.413006 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.414054 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.425009 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.425744 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.426683 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.426831 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.426955 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.428244 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.428379 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.442582 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.443164 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.444023 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.444503 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.447423 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.447515 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.447792 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.447900 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v8msm"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.448291 
4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hxspx"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.448503 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.448655 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.448837 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449017 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449139 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449254 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449373 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449508 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449639 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449766 4991 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.449872 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.450001 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.450749 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.451840 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vvg2d"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.451931 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.452561 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.452634 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453132 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453234 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453343 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453365 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453378 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453542 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453582 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453544 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453611 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453726 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.453804 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.454409 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.454827 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.454896 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.459455 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463056 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463626 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd63c06e-9514-4718-8678-db55b7deedfc-audit-dir\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463679 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/784eec32-b943-4380-9c3a-7f72b1756906-etcd-client\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463713 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/115cccb9-61dc-46f6-bdf3-6bc992b22b54-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463733 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4655db-a383-4f46-afbb-05557deaeccf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463760 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6rd\" (UniqueName: \"kubernetes.io/projected/6c058a72-250e-4d50-989a-50a9bb515182-kube-api-access-dn6rd\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463788 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463810 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x5kd\" (UniqueName: \"kubernetes.io/projected/8723600d-fcf8-414f-a440-da9d65d41e07-kube-api-access-4x5kd\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463829 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2spk\" (UniqueName: \"kubernetes.io/projected/9aca593e-3156-4661-9e09-5c9d355801d5-kube-api-access-h2spk\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463851 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463869 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-service-ca\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.463890 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784eec32-b943-4380-9c3a-7f72b1756906-serving-cert\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463906 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5vwf\" (UniqueName: \"kubernetes.io/projected/784eec32-b943-4380-9c3a-7f72b1756906-kube-api-access-j5vwf\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463921 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/115cccb9-61dc-46f6-bdf3-6bc992b22b54-proxy-tls\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463939 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463959 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tl4n\" (UniqueName: 
\"kubernetes.io/projected/cb4655db-a383-4f46-afbb-05557deaeccf-kube-api-access-4tl4n\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.463978 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-image-import-ca\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464017 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-serving-cert\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464035 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464052 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464071 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464089 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464109 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f04204c-29c6-4b62-b4f4-28935672a20e-images\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464146 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dd63c06e-9514-4718-8678-db55b7deedfc-node-pullsecrets\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464166 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-client-ca\") pod \"controller-manager-879f6c89f-mw769\" 
(UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464187 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464206 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbm75\" (UniqueName: \"kubernetes.io/projected/6ef47549-ccde-4565-935c-8a92998c5eb3-kube-api-access-bbm75\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464225 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb62c42-c462-451d-a594-29333e494057-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464244 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd82h\" (UniqueName: \"kubernetes.io/projected/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-kube-api-access-vd82h\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: 
I1206 01:04:36.464261 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a829457-575c-4b6b-bc3b-68c117691597-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c9qfb\" (UID: \"9a829457-575c-4b6b-bc3b-68c117691597\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464279 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-trusted-ca-bundle\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464296 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-oauth-serving-cert\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464313 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-audit-policies\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464331 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb62c42-c462-451d-a594-29333e494057-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464347 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-config\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464364 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-serving-cert\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cglf\" (UniqueName: \"kubernetes.io/projected/af2cb7cc-ae54-4798-a45c-5816a09d1009-kube-api-access-4cglf\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464398 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2524c61e-9353-41bd-acda-66380b2381b2-audit-dir\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464416 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464432 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4655db-a383-4f46-afbb-05557deaeccf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464459 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-client-ca\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464479 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c058a72-250e-4d50-989a-50a9bb515182-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464559 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-oauth-config\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464597 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt48w\" (UniqueName: \"kubernetes.io/projected/2524c61e-9353-41bd-acda-66380b2381b2-kube-api-access-vt48w\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464916 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9dh5\" (UniqueName: \"kubernetes.io/projected/dd63c06e-9514-4718-8678-db55b7deedfc-kube-api-access-p9dh5\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.464963 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465023 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-encryption-config\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465044 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/af2cb7cc-ae54-4798-a45c-5816a09d1009-profile-collector-cert\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465063 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465080 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7sgl\" (UniqueName: \"kubernetes.io/projected/115cccb9-61dc-46f6-bdf3-6bc992b22b54-kube-api-access-k7sgl\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465104 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-etcd-client\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465121 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-etcd-ca\") pod 
\"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465138 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-serving-cert\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465156 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-etcd-serving-ca\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdb62c42-c462-451d-a594-29333e494057-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465190 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-config\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465208 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef47549-ccde-4565-935c-8a92998c5eb3-serving-cert\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465233 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wx9g\" (UniqueName: \"kubernetes.io/projected/6f04204c-29c6-4b62-b4f4-28935672a20e-kube-api-access-8wx9g\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465253 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9aca593e-3156-4661-9e09-5c9d355801d5-trusted-ca\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-etcd-service-ca\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465290 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f04204c-29c6-4b62-b4f4-28935672a20e-config\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: 
\"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465357 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9aca593e-3156-4661-9e09-5c9d355801d5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465379 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c058a72-250e-4d50-989a-50a9bb515182-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465416 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465499 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fg95\" (UniqueName: \"kubernetes.io/projected/9a829457-575c-4b6b-bc3b-68c117691597-kube-api-access-9fg95\") pod \"control-plane-machine-set-operator-78cbb6b69f-c9qfb\" (UID: \"9a829457-575c-4b6b-bc3b-68c117691597\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.465523 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-config\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465542 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465560 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f04204c-29c6-4b62-b4f4-28935672a20e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465578 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-console-config\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465594 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/af2cb7cc-ae54-4798-a45c-5816a09d1009-srv-cert\") pod 
\"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465613 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-audit\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465633 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwn96\" (UniqueName: \"kubernetes.io/projected/b9bc7fc1-7d76-413c-9d29-58584564ac16-kube-api-access-lwn96\") pod \"migrator-59844c95c7-mgs4q\" (UID: \"b9bc7fc1-7d76-413c-9d29-58584564ac16\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465649 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-config\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465669 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465690 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-config\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465806 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465836 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aca593e-3156-4661-9e09-5c9d355801d5-metrics-tls\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.465835 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd63c06e-9514-4718-8678-db55b7deedfc-audit-dir\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.466471 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.467275 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.467497 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.467747 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.468537 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.469210 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.469430 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dd63c06e-9514-4718-8678-db55b7deedfc-node-pullsecrets\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.469770 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.472187 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9aca593e-3156-4661-9e09-5c9d355801d5-metrics-tls\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.472291 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 
01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.472814 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-config\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.473217 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-audit\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.473380 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9aca593e-3156-4661-9e09-5c9d355801d5-trusted-ca\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.473957 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.474088 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-etcd-serving-ca\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.474194 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-client-ca\") pod 
\"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.474563 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.474765 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.475312 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-encryption-config\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.475507 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.475734 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-config\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.476623 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/dd63c06e-9514-4718-8678-db55b7deedfc-image-import-ca\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 
01:04:36.477836 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-etcd-client\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.479438 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.479928 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef47549-ccde-4565-935c-8a92998c5eb3-serving-cert\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.480164 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.480829 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kpfbw"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.484802 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.490300 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qhtk2"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.491498 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.492887 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.501637 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd63c06e-9514-4718-8678-db55b7deedfc-serving-cert\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.501867 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.502655 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.502828 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.507837 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.511395 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.512613 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vtscj"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.514288 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.515090 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.530690 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.531125 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.532231 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxb7h"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.536514 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.556747 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.557636 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-s7xc9"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.565069 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567527 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wx9g\" (UniqueName: \"kubernetes.io/projected/6f04204c-29c6-4b62-b4f4-28935672a20e-kube-api-access-8wx9g\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567581 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-config\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567607 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67519387-ecfc-4c03-a199-6679431ac96f-audit-dir\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567632 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb00ea97-33bb-448f-afb1-372543ca82f5-auth-proxy-config\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567654 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f04204c-29c6-4b62-b4f4-28935672a20e-config\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567670 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-etcd-service-ca\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567689 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82128768-e91b-4e38-bc1b-2915bc9d4436-serving-cert\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567714 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c058a72-250e-4d50-989a-50a9bb515182-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567729 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eb00ea97-33bb-448f-afb1-372543ca82f5-machine-approver-tls\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567749 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5drqm\" (UniqueName: \"kubernetes.io/projected/67519387-ecfc-4c03-a199-6679431ac96f-kube-api-access-5drqm\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567764 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grdnk\" (UniqueName: \"kubernetes.io/projected/d7a020b5-5981-4927-a254-ecad13dce340-kube-api-access-grdnk\") pod \"dns-operator-744455d44c-kpfbw\" (UID: \"d7a020b5-5981-4927-a254-ecad13dce340\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567782 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567801 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567824 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fg95\" (UniqueName: \"kubernetes.io/projected/9a829457-575c-4b6b-bc3b-68c117691597-kube-api-access-9fg95\") pod \"control-plane-machine-set-operator-78cbb6b69f-c9qfb\" (UID: \"9a829457-575c-4b6b-bc3b-68c117691597\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-default-certificate\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-audit-policies\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567878 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-serving-cert\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: 
\"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567907 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f04204c-29c6-4b62-b4f4-28935672a20e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567933 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-console-config\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.567955 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/af2cb7cc-ae54-4798-a45c-5816a09d1009-srv-cert\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568011 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-config\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568040 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568066 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwn96\" (UniqueName: \"kubernetes.io/projected/b9bc7fc1-7d76-413c-9d29-58584564ac16-kube-api-access-lwn96\") pod \"migrator-59844c95c7-mgs4q\" (UID: \"b9bc7fc1-7d76-413c-9d29-58584564ac16\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568087 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-config\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568109 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568133 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.568138 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568153 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-stats-auth\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568273 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7a020b5-5981-4927-a254-ecad13dce340-metrics-tls\") pod \"dns-operator-744455d44c-kpfbw\" (UID: \"d7a020b5-5981-4927-a254-ecad13dce340\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568303 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568327 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c087aa7-b699-4d07-adb8-e3082e693727-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4rtwr\" (UID: \"0c087aa7-b699-4d07-adb8-e3082e693727\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:36 crc kubenswrapper[4991]: 
I1206 01:04:36.568365 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/784eec32-b943-4380-9c3a-7f72b1756906-etcd-client\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568388 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/115cccb9-61dc-46f6-bdf3-6bc992b22b54-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568408 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4655db-a383-4f46-afbb-05557deaeccf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568427 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6rd\" (UniqueName: \"kubernetes.io/projected/6c058a72-250e-4d50-989a-50a9bb515182-kube-api-access-dn6rd\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568447 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568469 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x5kd\" (UniqueName: \"kubernetes.io/projected/8723600d-fcf8-414f-a440-da9d65d41e07-kube-api-access-4x5kd\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568486 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcqd\" (UniqueName: \"kubernetes.io/projected/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-kube-api-access-mjcqd\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568520 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568537 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-service-ca\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.568560 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784eec32-b943-4380-9c3a-7f72b1756906-serving-cert\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568580 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-serving-cert\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568600 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5vwf\" (UniqueName: \"kubernetes.io/projected/784eec32-b943-4380-9c3a-7f72b1756906-kube-api-access-j5vwf\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568603 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568977 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.569148 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.569285 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.569302 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.569714 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wlpng"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.570337 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.570790 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mwwth"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571157 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571172 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h594s"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571184 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571278 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.568618 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/115cccb9-61dc-46f6-bdf3-6bc992b22b54-proxy-tls\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571677 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571743 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tl4n\" (UniqueName: \"kubernetes.io/projected/cb4655db-a383-4f46-afbb-05557deaeccf-kube-api-access-4tl4n\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571767 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571831 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571857 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f04204c-29c6-4b62-b4f4-28935672a20e-images\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571889 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb62c42-c462-451d-a594-29333e494057-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-client-ca\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571929 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571949 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.571972 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-trusted-ca-bundle\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.572549 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q"] Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.573436 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.574173 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f04204c-29c6-4b62-b4f4-28935672a20e-config\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.574627 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/115cccb9-61dc-46f6-bdf3-6bc992b22b54-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.575001 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-etcd-service-ca\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.575610 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c058a72-250e-4d50-989a-50a9bb515182-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.575634 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4655db-a383-4f46-afbb-05557deaeccf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.576613 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.577412 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-console-config\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.578007 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.578307 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.578497 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.578784 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.579961 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-647zw"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.580065 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-service-ca\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.581147 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hwqgl"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.581171 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.581322 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583053 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-oauth-serving-cert\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583111 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd82h\" (UniqueName: \"kubernetes.io/projected/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-kube-api-access-vd82h\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583139 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a829457-575c-4b6b-bc3b-68c117691597-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c9qfb\" (UID: \"9a829457-575c-4b6b-bc3b-68c117691597\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583166 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb62c42-c462-451d-a594-29333e494057-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583196 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-config\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583219 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-audit-policies\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-etcd-client\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583266 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4655db-a383-4f46-afbb-05557deaeccf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583298 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-serving-cert\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583322 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4cglf\" (UniqueName: \"kubernetes.io/projected/af2cb7cc-ae54-4798-a45c-5816a09d1009-kube-api-access-4cglf\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583347 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2524c61e-9353-41bd-acda-66380b2381b2-audit-dir\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583367 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583388 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f82106-cdc2-4dec-a30d-221e17bdc50d-service-ca-bundle\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583410 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqtws\" (UniqueName: \"kubernetes.io/projected/eb00ea97-33bb-448f-afb1-372543ca82f5-kube-api-access-gqtws\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583429 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-encryption-config\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583459 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c058a72-250e-4d50-989a-50a9bb515182-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583502 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-oauth-config\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583529 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt48w\" (UniqueName: \"kubernetes.io/projected/2524c61e-9353-41bd-acda-66380b2381b2-kube-api-access-vt48w\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583570 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583594 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/82128768-e91b-4e38-bc1b-2915bc9d4436-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583621 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvkzk\" (UniqueName: \"kubernetes.io/projected/0c087aa7-b699-4d07-adb8-e3082e693727-kube-api-access-mvkzk\") pod \"cluster-samples-operator-665b6dd947-4rtwr\" (UID: \"0c087aa7-b699-4d07-adb8-e3082e693727\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583649 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/af2cb7cc-ae54-4798-a45c-5816a09d1009-profile-collector-cert\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583673 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583698 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7sgl\" (UniqueName: \"kubernetes.io/projected/115cccb9-61dc-46f6-bdf3-6bc992b22b54-kube-api-access-k7sgl\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583718 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583759 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-etcd-ca\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583780 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-serving-cert\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583808 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79p52\" (UniqueName: 
\"kubernetes.io/projected/82128768-e91b-4e38-bc1b-2915bc9d4436-kube-api-access-79p52\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583837 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdb62c42-c462-451d-a594-29333e494057-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583859 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-config\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583882 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-metrics-certs\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583907 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdlkd\" (UniqueName: \"kubernetes.io/projected/e8f82106-cdc2-4dec-a30d-221e17bdc50d-kube-api-access-tdlkd\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " 
pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583925 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb00ea97-33bb-448f-afb1-372543ca82f5-config\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.583945 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.587703 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.588092 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/784eec32-b943-4380-9c3a-7f72b1756906-etcd-client\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.588257 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.588266 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.588618 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v8msm"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.589260 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f04204c-29c6-4b62-b4f4-28935672a20e-images\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.589760 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-etcd-ca\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.590008 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb62c42-c462-451d-a594-29333e494057-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.590567 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-oauth-serving-cert\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.591643 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/115cccb9-61dc-46f6-bdf3-6bc992b22b54-proxy-tls\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.591852 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.591882 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-trusted-ca-bundle\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.592291 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.592333 4991 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.592475 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2524c61e-9353-41bd-acda-66380b2381b2-audit-dir\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.592868 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-audit-policies\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593073 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593422 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784eec32-b943-4380-9c3a-7f72b1756906-config\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593621 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb62c42-c462-451d-a594-29333e494057-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593668 
4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hxspx"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593701 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-oauth-config\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593741 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-serving-cert\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.593836 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.594312 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.594387 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb4655db-a383-4f46-afbb-05557deaeccf-serving-cert\") 
pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.595322 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kpfbw"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.596179 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c058a72-250e-4d50-989a-50a9bb515182-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.596677 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.596754 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mw769"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.597874 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.598963 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mzbqd"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.599658 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.599909 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f04204c-29c6-4b62-b4f4-28935672a20e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.600607 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784eec32-b943-4380-9c3a-7f72b1756906-serving-cert\") pod \"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.601506 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-client-ca\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.602364 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-config\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 
01:04:36.602616 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-config\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.603429 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.604057 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-s7xc9"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.605421 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.605442 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/af2cb7cc-ae54-4798-a45c-5816a09d1009-profile-collector-cert\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.605969 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-serving-cert\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.606562 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a829457-575c-4b6b-bc3b-68c117691597-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c9qfb\" (UID: \"9a829457-575c-4b6b-bc3b-68c117691597\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.607329 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.608549 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/af2cb7cc-ae54-4798-a45c-5816a09d1009-srv-cert\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.609188 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.611026 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xh4tr"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.613244 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-97gxp"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.614312 4991 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.614348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.615968 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.616552 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.617615 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.619073 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vtscj"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.620327 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.621258 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.622491 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.623298 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.624836 4991 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9pxz5"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.628674 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.629809 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.630976 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c6qz6"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.632374 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.636608 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vvg2d"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.646045 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.648393 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.650675 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxb7h"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.654130 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.654181 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6qz6"] Dec 06 01:04:36 crc 
kubenswrapper[4991]: I1206 01:04:36.655449 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wlpng"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.657348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.658469 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mwwth"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.661261 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-647zw"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.662413 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pnxx4"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.663331 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.663446 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pnxx4"] Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.665491 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.684868 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/82128768-e91b-4e38-bc1b-2915bc9d4436-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.684927 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mvkzk\" (UniqueName: \"kubernetes.io/projected/0c087aa7-b699-4d07-adb8-e3082e693727-kube-api-access-mvkzk\") pod \"cluster-samples-operator-665b6dd947-4rtwr\" (UID: \"0c087aa7-b699-4d07-adb8-e3082e693727\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.684973 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685022 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79p52\" (UniqueName: \"kubernetes.io/projected/82128768-e91b-4e38-bc1b-2915bc9d4436-kube-api-access-79p52\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685048 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-metrics-certs\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685082 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685113 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdlkd\" (UniqueName: \"kubernetes.io/projected/e8f82106-cdc2-4dec-a30d-221e17bdc50d-kube-api-access-tdlkd\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685139 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb00ea97-33bb-448f-afb1-372543ca82f5-config\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685169 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-config\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685192 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb00ea97-33bb-448f-afb1-372543ca82f5-auth-proxy-config\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685211 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/67519387-ecfc-4c03-a199-6679431ac96f-audit-dir\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685231 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eb00ea97-33bb-448f-afb1-372543ca82f5-machine-approver-tls\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685249 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5drqm\" (UniqueName: \"kubernetes.io/projected/67519387-ecfc-4c03-a199-6679431ac96f-kube-api-access-5drqm\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685307 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82128768-e91b-4e38-bc1b-2915bc9d4436-serving-cert\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685337 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grdnk\" (UniqueName: \"kubernetes.io/projected/d7a020b5-5981-4927-a254-ecad13dce340-kube-api-access-grdnk\") pod \"dns-operator-744455d44c-kpfbw\" (UID: \"d7a020b5-5981-4927-a254-ecad13dce340\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685359 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-default-certificate\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-audit-policies\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685403 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-config\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685420 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-serving-cert\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685438 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685468 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-stats-auth\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685520 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685540 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c087aa7-b699-4d07-adb8-e3082e693727-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4rtwr\" (UID: \"0c087aa7-b699-4d07-adb8-e3082e693727\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685563 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7a020b5-5981-4927-a254-ecad13dce340-metrics-tls\") pod \"dns-operator-744455d44c-kpfbw\" (UID: \"d7a020b5-5981-4927-a254-ecad13dce340\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685614 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcqd\" (UniqueName: 
\"kubernetes.io/projected/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-kube-api-access-mjcqd\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685647 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-serving-cert\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685672 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685710 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685751 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-etcd-client\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685790 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f82106-cdc2-4dec-a30d-221e17bdc50d-service-ca-bundle\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685815 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqtws\" (UniqueName: \"kubernetes.io/projected/eb00ea97-33bb-448f-afb1-372543ca82f5-kube-api-access-gqtws\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685836 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-encryption-config\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.685922 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/82128768-e91b-4e38-bc1b-2915bc9d4436-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.686575 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.687077 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eb00ea97-33bb-448f-afb1-372543ca82f5-config\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.687246 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67519387-ecfc-4c03-a199-6679431ac96f-audit-dir\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.687627 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb00ea97-33bb-448f-afb1-372543ca82f5-auth-proxy-config\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.690201 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eb00ea97-33bb-448f-afb1-372543ca82f5-machine-approver-tls\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.691384 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-serving-cert\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.705228 4991 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.708111 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-config\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.726428 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.745907 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.747821 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.767031 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.792885 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.798242 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.806412 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.827488 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.840417 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c087aa7-b699-4d07-adb8-e3082e693727-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4rtwr\" (UID: \"0c087aa7-b699-4d07-adb8-e3082e693727\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.847123 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.866046 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.886240 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.920892 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2spk\" (UniqueName: \"kubernetes.io/projected/9aca593e-3156-4661-9e09-5c9d355801d5-kube-api-access-h2spk\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.940920 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbm75\" (UniqueName: \"kubernetes.io/projected/6ef47549-ccde-4565-935c-8a92998c5eb3-kube-api-access-bbm75\") pod \"route-controller-manager-6576b87f9c-nhs9f\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:36 crc kubenswrapper[4991]: I1206 01:04:36.985828 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9aca593e-3156-4661-9e09-5c9d355801d5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2xvjl\" (UID: \"9aca593e-3156-4661-9e09-5c9d355801d5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.002256 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9dh5\" (UniqueName: \"kubernetes.io/projected/dd63c06e-9514-4718-8678-db55b7deedfc-kube-api-access-p9dh5\") pod \"apiserver-76f77b778f-h594s\" (UID: \"dd63c06e-9514-4718-8678-db55b7deedfc\") " pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.005552 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.025312 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.046687 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.057509 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.067799 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.078840 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-config\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.086136 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.107028 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.126903 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.132180 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7a020b5-5981-4927-a254-ecad13dce340-metrics-tls\") pod \"dns-operator-744455d44c-kpfbw\" (UID: \"d7a020b5-5981-4927-a254-ecad13dce340\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.149853 4991 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.166004 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.186030 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.192139 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-default-certificate\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.206788 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.212562 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-stats-auth\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.219875 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.227095 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.242263 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f82106-cdc2-4dec-a30d-221e17bdc50d-metrics-certs\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.243511 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.246025 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.261227 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.266658 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.268683 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f82106-cdc2-4dec-a30d-221e17bdc50d-service-ca-bundle\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.285894 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.307206 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.318854 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.327512 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.347714 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.366544 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 01:04:37 crc 
kubenswrapper[4991]: I1206 01:04:37.373072 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-etcd-client\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.386760 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.390460 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-serving-cert\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.406675 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.420636 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67519387-ecfc-4c03-a199-6679431ac96f-encryption-config\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.426829 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.428683 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: 
\"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.447213 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.458463 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl"] Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.466463 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.489038 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.489783 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"] Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.498937 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67519387-ecfc-4c03-a199-6679431ac96f-audit-policies\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.504163 4991 request.go:700] Waited for 1.000335432s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-operator-dockercfg-98p87&limit=500&resourceVersion=0 Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.507113 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 01:04:37 crc kubenswrapper[4991]: W1206 01:04:37.516486 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef47549_ccde_4565_935c_8a92998c5eb3.slice/crio-55eb1c88b0be911f701d88c8ed61ffb30352ccb0fff37e3de488a1e8822ab894 WatchSource:0}: Error finding container 55eb1c88b0be911f701d88c8ed61ffb30352ccb0fff37e3de488a1e8822ab894: Status 404 returned error can't find the container with id 55eb1c88b0be911f701d88c8ed61ffb30352ccb0fff37e3de488a1e8822ab894 Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.527282 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.538340 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h594s"] Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.546479 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: W1206 01:04:37.555866 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd63c06e_9514_4718_8678_db55b7deedfc.slice/crio-f77e9ca3b545e0f58039db29114bb343c51119768327cec1c4fda5f404f393eb WatchSource:0}: Error finding container f77e9ca3b545e0f58039db29114bb343c51119768327cec1c4fda5f404f393eb: Status 404 returned error can't find the container with id f77e9ca3b545e0f58039db29114bb343c51119768327cec1c4fda5f404f393eb Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.566846 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.586461 4991 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.606105 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.613168 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82128768-e91b-4e38-bc1b-2915bc9d4436-serving-cert\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.626395 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.645842 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.666788 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.685660 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.705830 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.745436 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 
01:04:37.765810 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.785425 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.810759 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.826336 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.861854 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wx9g\" (UniqueName: \"kubernetes.io/projected/6f04204c-29c6-4b62-b4f4-28935672a20e-kube-api-access-8wx9g\") pod \"machine-api-operator-5694c8668f-hwqgl\" (UID: \"6f04204c-29c6-4b62-b4f4-28935672a20e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.865851 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.884891 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.905602 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.911881 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" event={"ID":"6ef47549-ccde-4565-935c-8a92998c5eb3","Type":"ContainerStarted","Data":"0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc"} Dec 06 
01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.911972 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" event={"ID":"6ef47549-ccde-4565-935c-8a92998c5eb3","Type":"ContainerStarted","Data":"55eb1c88b0be911f701d88c8ed61ffb30352ccb0fff37e3de488a1e8822ab894"} Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.913129 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.914878 4991 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-nhs9f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.914927 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" podUID="6ef47549-ccde-4565-935c-8a92998c5eb3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.915019 4991 generic.go:334] "Generic (PLEG): container finished" podID="dd63c06e-9514-4718-8678-db55b7deedfc" containerID="2e89efc4dd828fad16f1e7e174ec77900f111f1ff448ac37753aa872fe5f7dde" exitCode=0 Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.915217 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h594s" event={"ID":"dd63c06e-9514-4718-8678-db55b7deedfc","Type":"ContainerDied","Data":"2e89efc4dd828fad16f1e7e174ec77900f111f1ff448ac37753aa872fe5f7dde"} Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.915249 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h594s" event={"ID":"dd63c06e-9514-4718-8678-db55b7deedfc","Type":"ContainerStarted","Data":"f77e9ca3b545e0f58039db29114bb343c51119768327cec1c4fda5f404f393eb"} Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.919220 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" event={"ID":"9aca593e-3156-4661-9e09-5c9d355801d5","Type":"ContainerStarted","Data":"a1571e72725aeef520a4b6b8909470648bab8b06e4eb545ab112d2431b465b4f"} Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.919297 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" event={"ID":"9aca593e-3156-4661-9e09-5c9d355801d5","Type":"ContainerStarted","Data":"4cffe723ebd23a9929194385a6ee0001518107cba8f7cc151296657965a4b228"} Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.926710 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.946546 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.967312 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 01:04:37 crc kubenswrapper[4991]: I1206 01:04:37.986776 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.027009 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwn96\" (UniqueName: \"kubernetes.io/projected/b9bc7fc1-7d76-413c-9d29-58584564ac16-kube-api-access-lwn96\") pod \"migrator-59844c95c7-mgs4q\" (UID: 
\"b9bc7fc1-7d76-413c-9d29-58584564ac16\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.042001 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fg95\" (UniqueName: \"kubernetes.io/projected/9a829457-575c-4b6b-bc3b-68c117691597-kube-api-access-9fg95\") pod \"control-plane-machine-set-operator-78cbb6b69f-c9qfb\" (UID: \"9a829457-575c-4b6b-bc3b-68c117691597\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.066214 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.066342 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6rd\" (UniqueName: \"kubernetes.io/projected/6c058a72-250e-4d50-989a-50a9bb515182-kube-api-access-dn6rd\") pod \"kube-storage-version-migrator-operator-b67b599dd-jrjzh\" (UID: \"6c058a72-250e-4d50-989a-50a9bb515182\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.080366 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.086767 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.106623 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.123246 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.126085 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.146775 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.163408 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.166663 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.188665 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.206844 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.226695 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.246131 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.268910 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.276106 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.300693 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hwqgl"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.311289 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x5kd\" (UniqueName: \"kubernetes.io/projected/8723600d-fcf8-414f-a440-da9d65d41e07-kube-api-access-4x5kd\") pod \"console-f9d7485db-97gxp\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:38 crc kubenswrapper[4991]: W1206 01:04:38.324211 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f04204c_29c6_4b62_b4f4_28935672a20e.slice/crio-93f149ca0c57057e6b5b0b8c0e9d9e238c62361e9b7e4257d06fb1b927af2d75 WatchSource:0}: Error finding container 93f149ca0c57057e6b5b0b8c0e9d9e238c62361e9b7e4257d06fb1b927af2d75: Status 404 returned error can't find the container with id 93f149ca0c57057e6b5b0b8c0e9d9e238c62361e9b7e4257d06fb1b927af2d75 Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.324532 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wgg26\" (UID: \"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.343799 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5vwf\" (UniqueName: \"kubernetes.io/projected/784eec32-b943-4380-9c3a-7f72b1756906-kube-api-access-j5vwf\") pod 
\"etcd-operator-b45778765-xh4tr\" (UID: \"784eec32-b943-4380-9c3a-7f72b1756906\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.361606 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tl4n\" (UniqueName: \"kubernetes.io/projected/cb4655db-a383-4f46-afbb-05557deaeccf-kube-api-access-4tl4n\") pod \"openshift-apiserver-operator-796bbdcf4f-vphlm\" (UID: \"cb4655db-a383-4f46-afbb-05557deaeccf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.367912 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.368186 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.386320 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.406368 4991 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.431445 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.441410 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.446197 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt48w\" (UniqueName: \"kubernetes.io/projected/2524c61e-9353-41bd-acda-66380b2381b2-kube-api-access-vt48w\") pod \"oauth-openshift-558db77b4-mzbqd\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.449555 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.463406 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd82h\" (UniqueName: \"kubernetes.io/projected/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-kube-api-access-vd82h\") pod \"controller-manager-879f6c89f-mw769\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.476198 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.487341 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cglf\" (UniqueName: \"kubernetes.io/projected/af2cb7cc-ae54-4798-a45c-5816a09d1009-kube-api-access-4cglf\") pod \"catalog-operator-68c6474976-rpzsp\" (UID: \"af2cb7cc-ae54-4798-a45c-5816a09d1009\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.490000 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.499522 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.504724 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7sgl\" (UniqueName: \"kubernetes.io/projected/115cccb9-61dc-46f6-bdf3-6bc992b22b54-kube-api-access-k7sgl\") pod \"machine-config-controller-84d6567774-xgdbs\" (UID: \"115cccb9-61dc-46f6-bdf3-6bc992b22b54\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.523708 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdb62c42-c462-451d-a594-29333e494057-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pn5g9\" (UID: \"fdb62c42-c462-451d-a594-29333e494057\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.524223 4991 request.go:700] Waited for 1.893950779s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.533634 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.533692 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.547339 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.568262 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.587589 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.607561 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.629494 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.646151 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.666973 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.672162 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.685594 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.711681 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.712657 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.756582 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.764344 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.765462 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62e4e6c5-0e68-43dd-b00a-6d1a07973d9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9w259\" (UID: \"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.777290 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvkzk\" (UniqueName: \"kubernetes.io/projected/0c087aa7-b699-4d07-adb8-e3082e693727-kube-api-access-mvkzk\") pod \"cluster-samples-operator-665b6dd947-4rtwr\" (UID: \"0c087aa7-b699-4d07-adb8-e3082e693727\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.784605 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.786343 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdlkd\" (UniqueName: \"kubernetes.io/projected/e8f82106-cdc2-4dec-a30d-221e17bdc50d-kube-api-access-tdlkd\") pod \"router-default-5444994796-qhtk2\" (UID: \"e8f82106-cdc2-4dec-a30d-221e17bdc50d\") " pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.800432 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.811937 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79p52\" (UniqueName: \"kubernetes.io/projected/82128768-e91b-4e38-bc1b-2915bc9d4436-kube-api-access-79p52\") pod \"openshift-config-operator-7777fb866f-vtscj\" (UID: \"82128768-e91b-4e38-bc1b-2915bc9d4436\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.845137 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcqd\" (UniqueName: \"kubernetes.io/projected/2dbea4fd-7c3f-4ce2-bf97-c25431e900e7-kube-api-access-mjcqd\") pod \"authentication-operator-69f744f599-vvg2d\" (UID: \"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.852769 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.853834 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grdnk\" (UniqueName: \"kubernetes.io/projected/d7a020b5-5981-4927-a254-ecad13dce340-kube-api-access-grdnk\") pod \"dns-operator-744455d44c-kpfbw\" (UID: \"d7a020b5-5981-4927-a254-ecad13dce340\") " pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.858564 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.867160 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.877915 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.879252 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqtws\" (UniqueName: \"kubernetes.io/projected/eb00ea97-33bb-448f-afb1-372543ca82f5-kube-api-access-gqtws\") pod \"machine-approver-56656f9798-8d52f\" (UID: \"eb00ea97-33bb-448f-afb1-372543ca82f5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.904070 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.927430 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5drqm\" (UniqueName: \"kubernetes.io/projected/67519387-ecfc-4c03-a199-6679431ac96f-kube-api-access-5drqm\") pod \"apiserver-7bbb656c7d-nt5q5\" (UID: \"67519387-ecfc-4c03-a199-6679431ac96f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932133 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgpn\" (UniqueName: \"kubernetes.io/projected/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-kube-api-access-8xgpn\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932197 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8n24\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-kube-api-access-b8n24\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932223 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae11df0-5cf0-4a53-b313-91c9182dde28-serving-cert\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932280 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-49b7d\" (UniqueName: \"kubernetes.io/projected/102ad02d-65ff-4d28-9d31-c95f66848f71-kube-api-access-49b7d\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932313 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-tmpfs\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932364 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4afb29d9-e4be-4c64-88d4-823d89c15ec8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932388 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-bound-sa-token\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932419 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932444 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/102ad02d-65ff-4d28-9d31-c95f66848f71-proxy-tls\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932468 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/102ad02d-65ff-4d28-9d31-c95f66848f71-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932507 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932551 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fd8fc3-c45f-43d0-aeff-65e3b1669071-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932573 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6qn\" (UniqueName: \"kubernetes.io/projected/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-kube-api-access-5q6qn\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932598 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-apiservice-cert\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932620 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ae11df0-5cf0-4a53-b313-91c9182dde28-config\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932666 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4s7\" (UniqueName: \"kubernetes.io/projected/1ae11df0-5cf0-4a53-b313-91c9182dde28-kube-api-access-zg4s7\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932685 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ae11df0-5cf0-4a53-b313-91c9182dde28-trusted-ca\") pod \"console-operator-58897d9998-hxspx\" (UID: 
\"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932729 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-certificates\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932792 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n6rl\" (UniqueName: \"kubernetes.io/projected/54fd8fc3-c45f-43d0-aeff-65e3b1669071-kube-api-access-8n6rl\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932827 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/102ad02d-65ff-4d28-9d31-c95f66848f71-images\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.932865 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-trusted-ca\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.933060 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-tls\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.933084 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-srv-cert\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.933103 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4afb29d9-e4be-4c64-88d4-823d89c15ec8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.933126 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-webhook-cert\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.933146 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fd8fc3-c45f-43d0-aeff-65e3b1669071-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: 
\"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:38 crc kubenswrapper[4991]: E1206 01:04:38.943344 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.443314788 +0000 UTC m=+152.793605256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.952271 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" event={"ID":"6c058a72-250e-4d50-989a-50a9bb515182","Type":"ContainerStarted","Data":"2610627f6c40f50df577e192b0916879940b483b035d403a0f09cd60515d4641"} Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.968787 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp"] Dec 06 01:04:38 crc kubenswrapper[4991]: I1206 01:04:38.983395 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" event={"ID":"cb4655db-a383-4f46-afbb-05557deaeccf","Type":"ContainerStarted","Data":"99a27e5ddb986d53119c66e47fd0374ef70a430d21ae10a2d80f561577b4846a"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.015024 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" event={"ID":"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a","Type":"ContainerStarted","Data":"94283bc7f7f8918948a34520e38c2ed87345bb59da286043aa4a01f1d588a7f9"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.027215 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" event={"ID":"b9bc7fc1-7d76-413c-9d29-58584564ac16","Type":"ContainerStarted","Data":"8b1d5093590a3c2ae86073f88053e411d77b6149d0d7808cd63a75dd932b9377"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.027282 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" event={"ID":"b9bc7fc1-7d76-413c-9d29-58584564ac16","Type":"ContainerStarted","Data":"771ea6254a57443acf7fc4215ed1e266b8aa7b851d10eb31b721d4e45ea1ece6"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.034128 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.034206 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.534187059 +0000 UTC m=+152.884477527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035520 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67d95f-81bb-491f-87ca-e15aba253031-config-volume\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035546 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035567 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgpn\" (UniqueName: \"kubernetes.io/projected/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-kube-api-access-8xgpn\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035585 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e335b799-3e6d-4214-bcae-ee6dd64054d5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035604 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8n24\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-kube-api-access-b8n24\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035622 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae11df0-5cf0-4a53-b313-91c9182dde28-serving-cert\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035646 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e335b799-3e6d-4214-bcae-ee6dd64054d5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035676 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49b7d\" (UniqueName: \"kubernetes.io/projected/102ad02d-65ff-4d28-9d31-c95f66848f71-kube-api-access-49b7d\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-tmpfs\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035734 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4afb29d9-e4be-4c64-88d4-823d89c15ec8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035750 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e335b799-3e6d-4214-bcae-ee6dd64054d5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035765 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035784 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-bound-sa-token\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035807 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035828 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/102ad02d-65ff-4d28-9d31-c95f66848f71-proxy-tls\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/102ad02d-65ff-4d28-9d31-c95f66848f71-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035903 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696dw\" (UniqueName: \"kubernetes.io/projected/0d67d95f-81bb-491f-87ca-e15aba253031-kube-api-access-696dw\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035941 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.035972 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf947a51-75da-452e-8947-561b0230bcab-cert\") pod \"ingress-canary-pnxx4\" (UID: \"cf947a51-75da-452e-8947-561b0230bcab\") " pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036059 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67d95f-81bb-491f-87ca-e15aba253031-secret-volume\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036077 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/36232195-9b1f-42bb-92df-0487bd3bac44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rw4lh\" (UID: \"36232195-9b1f-42bb-92df-0487bd3bac44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/54fd8fc3-c45f-43d0-aeff-65e3b1669071-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036145 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6qn\" (UniqueName: \"kubernetes.io/projected/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-kube-api-access-5q6qn\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036164 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65728f91-9c05-4fc3-af26-8ce6838a2d36-config-volume\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036180 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-apiservice-cert\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036208 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ae11df0-5cf0-4a53-b313-91c9182dde28-config\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036241 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcr2t\" (UniqueName: \"kubernetes.io/projected/36232195-9b1f-42bb-92df-0487bd3bac44-kube-api-access-qcr2t\") pod \"package-server-manager-789f6589d5-rw4lh\" (UID: \"36232195-9b1f-42bb-92df-0487bd3bac44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036267 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4s7\" (UniqueName: \"kubernetes.io/projected/1ae11df0-5cf0-4a53-b313-91c9182dde28-kube-api-access-zg4s7\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036283 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ae11df0-5cf0-4a53-b313-91c9182dde28-trusted-ca\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036310 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlm24\" (UniqueName: \"kubernetes.io/projected/e335b799-3e6d-4214-bcae-ee6dd64054d5-kube-api-access-dlm24\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036333 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tv58\" (UniqueName: 
\"kubernetes.io/projected/cf947a51-75da-452e-8947-561b0230bcab-kube-api-access-7tv58\") pod \"ingress-canary-pnxx4\" (UID: \"cf947a51-75da-452e-8947-561b0230bcab\") " pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036357 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjztb\" (UniqueName: \"kubernetes.io/projected/65728f91-9c05-4fc3-af26-8ce6838a2d36-kube-api-access-kjztb\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036568 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-certificates\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036620 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n6rl\" (UniqueName: \"kubernetes.io/projected/54fd8fc3-c45f-43d0-aeff-65e3b1669071-kube-api-access-8n6rl\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036636 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65728f91-9c05-4fc3-af26-8ce6838a2d36-metrics-tls\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036651 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgsrf\" (UniqueName: \"kubernetes.io/projected/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-kube-api-access-tgsrf\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036686 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/102ad02d-65ff-4d28-9d31-c95f66848f71-images\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036730 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-trusted-ca\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036766 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-tls\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036785 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-srv-cert\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036811 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4afb29d9-e4be-4c64-88d4-823d89c15ec8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036827 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-webhook-cert\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.036848 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fd8fc3-c45f-43d0-aeff-65e3b1669071-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.039872 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fd8fc3-c45f-43d0-aeff-65e3b1669071-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.044462 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mzbqd"] Dec 06 
01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.044883 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ae11df0-5cf0-4a53-b313-91c9182dde28-config\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.046667 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-trusted-ca\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.047044 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/102ad02d-65ff-4d28-9d31-c95f66848f71-images\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.047344 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.547328001 +0000 UTC m=+152.897618469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.047867 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4afb29d9-e4be-4c64-88d4-823d89c15ec8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.050975 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ae11df0-5cf0-4a53-b313-91c9182dde28-trusted-ca\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.053553 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-certificates\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.053860 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-tmpfs\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") 
" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.055797 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/102ad02d-65ff-4d28-9d31-c95f66848f71-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.056739 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fd8fc3-c45f-43d0-aeff-65e3b1669071-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.059642 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-srv-cert\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.060181 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4afb29d9-e4be-4c64-88d4-823d89c15ec8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.061485 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-webhook-cert\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.062606 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-apiservice-cert\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.063219 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae11df0-5cf0-4a53-b313-91c9182dde28-serving-cert\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.064038 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/102ad02d-65ff-4d28-9d31-c95f66848f71-proxy-tls\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.065489 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-tls\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.076044 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.084772 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" event={"ID":"6f04204c-29c6-4b62-b4f4-28935672a20e","Type":"ContainerStarted","Data":"db94611899daa2138088e3ff2103fc04938aadd676a64297802486673053e6f4"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.084827 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" event={"ID":"6f04204c-29c6-4b62-b4f4-28935672a20e","Type":"ContainerStarted","Data":"93f149ca0c57057e6b5b0b8c0e9d9e238c62361e9b7e4257d06fb1b927af2d75"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.107163 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6qn\" (UniqueName: \"kubernetes.io/projected/9e67932d-8fce-413c-895a-c3e9c6fb0ca3-kube-api-access-5q6qn\") pod \"olm-operator-6b444d44fb-f5qmc\" (UID: \"9e67932d-8fce-413c-895a-c3e9c6fb0ca3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.107692 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-bound-sa-token\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.113336 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.119164 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xh4tr"] Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.120609 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.129354 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgpn\" (UniqueName: \"kubernetes.io/projected/6a61cec3-9e94-43d4-8e8e-42674e6e6ef9-kube-api-access-8xgpn\") pod \"packageserver-d55dfcdfc-qv97d\" (UID: \"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144118 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144232 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67d95f-81bb-491f-87ca-e15aba253031-config-volume\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144265 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144296 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2zfr\" (UniqueName: \"kubernetes.io/projected/f130c551-8102-4733-8447-8f749c2f90f5-kube-api-access-w2zfr\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144317 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e335b799-3e6d-4214-bcae-ee6dd64054d5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144394 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-mountpoint-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144412 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e335b799-3e6d-4214-bcae-ee6dd64054d5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 
01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144464 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-signing-cabundle\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144485 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" event={"ID":"9a829457-575c-4b6b-bc3b-68c117691597","Type":"ContainerStarted","Data":"c3daf01d4f268d989f940f2657efc825b826c53dc1974c762c221fb42faa5115"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144502 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e335b799-3e6d-4214-bcae-ee6dd64054d5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144579 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144648 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-plugins-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " 
pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144692 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtg2\" (UniqueName: \"kubernetes.io/projected/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-kube-api-access-2rtg2\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144726 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-signing-key\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144748 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad326998-ebb8-427b-979c-d21b83f25778-serving-cert\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144776 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-696dw\" (UniqueName: \"kubernetes.io/projected/0d67d95f-81bb-491f-87ca-e15aba253031-kube-api-access-696dw\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144826 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lgb\" (UniqueName: 
\"kubernetes.io/projected/ad326998-ebb8-427b-979c-d21b83f25778-kube-api-access-l7lgb\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144846 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf947a51-75da-452e-8947-561b0230bcab-cert\") pod \"ingress-canary-pnxx4\" (UID: \"cf947a51-75da-452e-8947-561b0230bcab\") " pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144938 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/36232195-9b1f-42bb-92df-0487bd3bac44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rw4lh\" (UID: \"36232195-9b1f-42bb-92df-0487bd3bac44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145004 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67d95f-81bb-491f-87ca-e15aba253031-secret-volume\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145040 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65728f91-9c05-4fc3-af26-8ce6838a2d36-config-volume\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145075 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qcr2t\" (UniqueName: \"kubernetes.io/projected/36232195-9b1f-42bb-92df-0487bd3bac44-kube-api-access-qcr2t\") pod \"package-server-manager-789f6589d5-rw4lh\" (UID: \"36232195-9b1f-42bb-92df-0487bd3bac44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145095 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a8e10183-5783-464a-96f1-2157a8690d9b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wlpng\" (UID: \"a8e10183-5783-464a-96f1-2157a8690d9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145164 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258ql\" (UniqueName: \"kubernetes.io/projected/a8e10183-5783-464a-96f1-2157a8690d9b-kube-api-access-258ql\") pod \"multus-admission-controller-857f4d67dd-wlpng\" (UID: \"a8e10183-5783-464a-96f1-2157a8690d9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145195 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlm24\" (UniqueName: \"kubernetes.io/projected/e335b799-3e6d-4214-bcae-ee6dd64054d5-kube-api-access-dlm24\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145219 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tv58\" (UniqueName: \"kubernetes.io/projected/cf947a51-75da-452e-8947-561b0230bcab-kube-api-access-7tv58\") pod 
\"ingress-canary-pnxx4\" (UID: \"cf947a51-75da-452e-8947-561b0230bcab\") " pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145240 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-registration-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145282 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjztb\" (UniqueName: \"kubernetes.io/projected/65728f91-9c05-4fc3-af26-8ce6838a2d36-kube-api-access-kjztb\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145376 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad326998-ebb8-427b-979c-d21b83f25778-config\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145395 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86xj\" (UniqueName: \"kubernetes.io/projected/bbd5656f-5e1f-4c7b-921d-6a13c8145217-kube-api-access-z86xj\") pod \"downloads-7954f5f757-mwwth\" (UID: \"bbd5656f-5e1f-4c7b-921d-6a13c8145217\") " pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145460 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/4785dba6-d446-4577-873f-7ef1b50887aa-node-bootstrap-token\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145478 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-socket-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145493 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-csi-data-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145530 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgsrf\" (UniqueName: \"kubernetes.io/projected/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-kube-api-access-tgsrf\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145549 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65728f91-9c05-4fc3-af26-8ce6838a2d36-metrics-tls\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145630 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/4785dba6-d446-4577-873f-7ef1b50887aa-certs\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.145646 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcj4\" (UniqueName: \"kubernetes.io/projected/4785dba6-d446-4577-873f-7ef1b50887aa-kube-api-access-rdcj4\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.149746 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8n24\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-kube-api-access-b8n24\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.150222 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.650200413 +0000 UTC m=+153.000491051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.144120 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:39 crc kubenswrapper[4991]: W1206 01:04:39.162641 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2524c61e_9353_41bd_acda_66380b2381b2.slice/crio-5b812d0c8b7c54ec0d577ab58f9d07de710d48958c2906a05159bf574b9accf1 WatchSource:0}: Error finding container 5b812d0c8b7c54ec0d577ab58f9d07de710d48958c2906a05159bf574b9accf1: Status 404 returned error can't find the container with id 5b812d0c8b7c54ec0d577ab58f9d07de710d48958c2906a05159bf574b9accf1 Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.164487 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.164895 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e335b799-3e6d-4214-bcae-ee6dd64054d5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.165133 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67d95f-81bb-491f-87ca-e15aba253031-config-volume\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.165811 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.166693 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65728f91-9c05-4fc3-af26-8ce6838a2d36-config-volume\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.166705 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n6rl\" (UniqueName: \"kubernetes.io/projected/54fd8fc3-c45f-43d0-aeff-65e3b1669071-kube-api-access-8n6rl\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-j79nx\" (UID: \"54fd8fc3-c45f-43d0-aeff-65e3b1669071\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.171137 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e335b799-3e6d-4214-bcae-ee6dd64054d5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.172187 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf947a51-75da-452e-8947-561b0230bcab-cert\") pod \"ingress-canary-pnxx4\" (UID: \"cf947a51-75da-452e-8947-561b0230bcab\") " pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.175641 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.178153 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67d95f-81bb-491f-87ca-e15aba253031-secret-volume\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.180302 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65728f91-9c05-4fc3-af26-8ce6838a2d36-metrics-tls\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.182877 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/36232195-9b1f-42bb-92df-0487bd3bac44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rw4lh\" (UID: \"36232195-9b1f-42bb-92df-0487bd3bac44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.183914 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.195786 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4s7\" (UniqueName: \"kubernetes.io/projected/1ae11df0-5cf0-4a53-b313-91c9182dde28-kube-api-access-zg4s7\") pod \"console-operator-58897d9998-hxspx\" (UID: \"1ae11df0-5cf0-4a53-b313-91c9182dde28\") " pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.203823 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h594s" event={"ID":"dd63c06e-9514-4718-8678-db55b7deedfc","Type":"ContainerStarted","Data":"4b3e28f758315339e8a5eb21b877ee2e8e87df87904b03ba848897bb15b3ccba"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.203874 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h594s" event={"ID":"dd63c06e-9514-4718-8678-db55b7deedfc","Type":"ContainerStarted","Data":"d5c8f1d1befd7d9d01b8925dcf471f2439dea2f4924a813bcf4051d20dc6c103"} Dec 06 01:04:39 crc 
kubenswrapper[4991]: I1206 01:04:39.211068 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" event={"ID":"9aca593e-3156-4661-9e09-5c9d355801d5","Type":"ContainerStarted","Data":"5a1fc1aee8c8c4a92094fc871acd497e46cc6ed4d6c39c2b695b925cd3f32f51"} Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.215455 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.241756 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e335b799-3e6d-4214-bcae-ee6dd64054d5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247560 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-plugins-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247625 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247655 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rtg2\" 
(UniqueName: \"kubernetes.io/projected/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-kube-api-access-2rtg2\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247694 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-signing-key\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247736 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad326998-ebb8-427b-979c-d21b83f25778-serving-cert\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247773 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lgb\" (UniqueName: \"kubernetes.io/projected/ad326998-ebb8-427b-979c-d21b83f25778-kube-api-access-l7lgb\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247877 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a8e10183-5783-464a-96f1-2157a8690d9b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wlpng\" (UID: \"a8e10183-5783-464a-96f1-2157a8690d9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247913 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258ql\" (UniqueName: \"kubernetes.io/projected/a8e10183-5783-464a-96f1-2157a8690d9b-kube-api-access-258ql\") pod \"multus-admission-controller-857f4d67dd-wlpng\" (UID: \"a8e10183-5783-464a-96f1-2157a8690d9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.247966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-registration-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248061 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad326998-ebb8-427b-979c-d21b83f25778-config\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248117 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z86xj\" (UniqueName: \"kubernetes.io/projected/bbd5656f-5e1f-4c7b-921d-6a13c8145217-kube-api-access-z86xj\") pod \"downloads-7954f5f757-mwwth\" (UID: \"bbd5656f-5e1f-4c7b-921d-6a13c8145217\") " pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248145 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4785dba6-d446-4577-873f-7ef1b50887aa-node-bootstrap-token\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " 
pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248206 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-socket-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248258 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-csi-data-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248343 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4785dba6-d446-4577-873f-7ef1b50887aa-certs\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248375 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcj4\" (UniqueName: \"kubernetes.io/projected/4785dba6-d446-4577-873f-7ef1b50887aa-kube-api-access-rdcj4\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248432 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zfr\" (UniqueName: \"kubernetes.io/projected/f130c551-8102-4733-8447-8f749c2f90f5-kube-api-access-w2zfr\") pod \"csi-hostpathplugin-647zw\" (UID: 
\"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248493 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-mountpoint-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248533 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-signing-cabundle\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.248897 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49b7d\" (UniqueName: \"kubernetes.io/projected/102ad02d-65ff-4d28-9d31-c95f66848f71-kube-api-access-49b7d\") pod \"machine-config-operator-74547568cd-jcdx4\" (UID: \"102ad02d-65ff-4d28-9d31-c95f66848f71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.249429 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-signing-cabundle\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.249958 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad326998-ebb8-427b-979c-d21b83f25778-config\") pod 
\"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.250323 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-plugins-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.250730 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.750712812 +0000 UTC m=+153.101003280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.251910 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-socket-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.251974 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-registration-dir\") pod 
\"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.252049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-csi-data-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.252730 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f130c551-8102-4733-8447-8f749c2f90f5-mountpoint-dir\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.270496 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9"] Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.274143 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a8e10183-5783-464a-96f1-2157a8690d9b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wlpng\" (UID: \"a8e10183-5783-464a-96f1-2157a8690d9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.276859 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlm24\" (UniqueName: \"kubernetes.io/projected/e335b799-3e6d-4214-bcae-ee6dd64054d5-kube-api-access-dlm24\") pod \"cluster-image-registry-operator-dc59b4c8b-9jcnt\" (UID: \"e335b799-3e6d-4214-bcae-ee6dd64054d5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc 
kubenswrapper[4991]: I1206 01:04:39.279889 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad326998-ebb8-427b-979c-d21b83f25778-serving-cert\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.280777 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4785dba6-d446-4577-873f-7ef1b50887aa-node-bootstrap-token\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.285058 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4785dba6-d446-4577-873f-7ef1b50887aa-certs\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.286841 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tv58\" (UniqueName: \"kubernetes.io/projected/cf947a51-75da-452e-8947-561b0230bcab-kube-api-access-7tv58\") pod \"ingress-canary-pnxx4\" (UID: \"cf947a51-75da-452e-8947-561b0230bcab\") " pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.306385 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjztb\" (UniqueName: \"kubernetes.io/projected/65728f91-9c05-4fc3-af26-8ce6838a2d36-kube-api-access-kjztb\") pod \"dns-default-c6qz6\" (UID: \"65728f91-9c05-4fc3-af26-8ce6838a2d36\") " pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: 
I1206 01:04:39.306636 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-signing-key\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.306845 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgsrf\" (UniqueName: \"kubernetes.io/projected/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-kube-api-access-tgsrf\") pod \"marketplace-operator-79b997595-rxb7h\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.351404 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.353633 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.354113 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.854074697 +0000 UTC m=+153.204365165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.354488 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.355093 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.855075494 +0000 UTC m=+153.205365962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.358350 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pnxx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.365971 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-97gxp"] Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.375848 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcr2t\" (UniqueName: \"kubernetes.io/projected/36232195-9b1f-42bb-92df-0487bd3bac44-kube-api-access-qcr2t\") pod \"package-server-manager-789f6589d5-rw4lh\" (UID: \"36232195-9b1f-42bb-92df-0487bd3bac44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.380275 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696dw\" (UniqueName: \"kubernetes.io/projected/0d67d95f-81bb-491f-87ca-e15aba253031-kube-api-access-696dw\") pod \"collect-profiles-29416380-k4nkv\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.388928 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z86xj\" (UniqueName: \"kubernetes.io/projected/bbd5656f-5e1f-4c7b-921d-6a13c8145217-kube-api-access-z86xj\") pod \"downloads-7954f5f757-mwwth\" (UID: \"bbd5656f-5e1f-4c7b-921d-6a13c8145217\") " pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.406252 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258ql\" (UniqueName: \"kubernetes.io/projected/a8e10183-5783-464a-96f1-2157a8690d9b-kube-api-access-258ql\") pod \"multus-admission-controller-857f4d67dd-wlpng\" (UID: \"a8e10183-5783-464a-96f1-2157a8690d9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 
01:04:39.406396 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.413104 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.472858 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.473331 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:39.973305077 +0000 UTC m=+153.323595545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.494317 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.504885 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2zfr\" (UniqueName: \"kubernetes.io/projected/f130c551-8102-4733-8447-8f749c2f90f5-kube-api-access-w2zfr\") pod \"csi-hostpathplugin-647zw\" (UID: \"f130c551-8102-4733-8447-8f749c2f90f5\") " pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.508322 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtg2\" (UniqueName: \"kubernetes.io/projected/b8fc9504-e68a-4d0a-99bc-3b8da1e78bac-kube-api-access-2rtg2\") pod \"service-ca-9c57cc56f-s7xc9\" (UID: \"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac\") " pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.509346 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcj4\" (UniqueName: \"kubernetes.io/projected/4785dba6-d446-4577-873f-7ef1b50887aa-kube-api-access-rdcj4\") pod \"machine-config-server-9pxz5\" (UID: \"4785dba6-d446-4577-873f-7ef1b50887aa\") " pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.513701 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lgb\" (UniqueName: \"kubernetes.io/projected/ad326998-ebb8-427b-979c-d21b83f25778-kube-api-access-l7lgb\") pod \"service-ca-operator-777779d784-bjq9r\" (UID: \"ad326998-ebb8-427b-979c-d21b83f25778\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.528707 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.540020 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.544225 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.552215 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.572652 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.573651 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.574881 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.575537 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.075520741 +0000 UTC m=+153.425811209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.581530 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.587927 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.639067 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-647zw" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.642452 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9pxz5" Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.678731 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.679800 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:40.179219696 +0000 UTC m=+153.529510164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.778860 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mw769"] Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.780961 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.781434 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.28141872 +0000 UTC m=+153.631709188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.891705 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vtscj"] Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.893034 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:39 crc kubenswrapper[4991]: E1206 01:04:39.893556 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.393535868 +0000 UTC m=+153.743826336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:39 crc kubenswrapper[4991]: I1206 01:04:39.959212 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259"] Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:39.997320 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:39.997820 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.497796948 +0000 UTC m=+153.848087416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.048836 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs"] Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.098412 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.098567 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.598543263 +0000 UTC m=+153.948833731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.199790 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.200558 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.700530531 +0000 UTC m=+154.050820999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.213050 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" podStartSLOduration=128.212980244 podStartE2EDuration="2m8.212980244s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:40.210474157 +0000 UTC m=+153.560764625" watchObservedRunningTime="2025-12-06 01:04:40.212980244 +0000 UTC m=+153.563270712" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.236185 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" event={"ID":"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a","Type":"ContainerStarted","Data":"a42fb104297b06e774ad5a03f5ae5df61865bb38e353c9a39d143f0f114e1cf4"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.279561 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" event={"ID":"6c058a72-250e-4d50-989a-50a9bb515182","Type":"ContainerStarted","Data":"1c28e43ac948227673b254e80ab0b8ab5209261e38ea29c60e3193feae8d3600"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.279958 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr"] 
Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.309769 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.324391 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:40.824345674 +0000 UTC m=+154.174636292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.412572 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.414234 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:40.914213598 +0000 UTC m=+154.264504066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.431497 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9pxz5" event={"ID":"4785dba6-d446-4577-873f-7ef1b50887aa","Type":"ContainerStarted","Data":"dd4b95a4e53a2c3012c2f531309417b51b1a248bad6ff0e167d0b5cdd76bcd65"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.449679 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" event={"ID":"c8f6a0fa-3b24-45b5-8c5f-ded3c2d01a4a","Type":"ContainerStarted","Data":"1e203f3e96397e00aea5b104cac38cbb143633cee3a33d12bba9b203f88f938f"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.517085 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" event={"ID":"b9bc7fc1-7d76-413c-9d29-58584564ac16","Type":"ContainerStarted","Data":"9d169f80317482c0697cc214fc3f8c88ed09959bd4ade03e24a8d37996fda7ab"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.519749 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.521153 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.021131128 +0000 UTC m=+154.371421596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.570877 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97gxp" event={"ID":"8723600d-fcf8-414f-a440-da9d65d41e07","Type":"ContainerStarted","Data":"320f8a90cc1d41152a902cb8345543ed323a0bad17247f6c31788749c52f7795"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.621709 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.621917 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" 
event={"ID":"eb00ea97-33bb-448f-afb1-372543ca82f5","Type":"ContainerStarted","Data":"a03106c4771bd08ae1e1b37ade222a424506065672d10e1282095f7019df0fd8"} Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.622133 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.12211418 +0000 UTC m=+154.472404648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.643126 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" event={"ID":"af2cb7cc-ae54-4798-a45c-5816a09d1009","Type":"ContainerStarted","Data":"fa0777b152c930a9b92a3e2a98f657a53d078db880ac42f65630cff9ea1026d1"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.643196 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" event={"ID":"af2cb7cc-ae54-4798-a45c-5816a09d1009","Type":"ContainerStarted","Data":"88a3c6ada0fadbc31853a6d4112a6f9dd45b0499630f263e4ad018266c090280"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.645424 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.682267 4991 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-rpzsp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.682360 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" podUID="af2cb7cc-ae54-4798-a45c-5816a09d1009" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.684403 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" event={"ID":"cb4655db-a383-4f46-afbb-05557deaeccf","Type":"ContainerStarted","Data":"38eb916d0384661d71e202353e322fbf47eba853de04afbc181c35a2951a1636"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.691026 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-h594s" podStartSLOduration=129.691001403 podStartE2EDuration="2m9.691001403s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:40.672227341 +0000 UTC m=+154.022517809" watchObservedRunningTime="2025-12-06 01:04:40.691001403 +0000 UTC m=+154.041291871" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.692439 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kpfbw"] Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.718119 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" 
event={"ID":"6f04204c-29c6-4b62-b4f4-28935672a20e","Type":"ContainerStarted","Data":"2614c0f7bd38895538aad3087c76d660eaa42e994e880929a6a5c005f4a787fe"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.724712 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.725121 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.225102025 +0000 UTC m=+154.575392493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.744564 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" event={"ID":"9a829457-575c-4b6b-bc3b-68c117691597","Type":"ContainerStarted","Data":"70979308956adae0f87d4e940e8a568df5212b14ed6dab0fa5de0a9bc970ee5c"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.753794 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" 
event={"ID":"784eec32-b943-4380-9c3a-7f72b1756906","Type":"ContainerStarted","Data":"d4e8cfd0ebb902e6fac5c59a01b6e1c66fdbe541ac0c6eda1fb7366868128214"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.753894 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" event={"ID":"784eec32-b943-4380-9c3a-7f72b1756906","Type":"ContainerStarted","Data":"a5a06d49646dd3222cd539252381e086df87f4a78d12619c3502d3124fb0ef5d"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.770771 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" event={"ID":"3acd3a6f-6dab-4bff-8f82-b96a8abc1555","Type":"ContainerStarted","Data":"6213860d80a4a1853515dff639c19fe1976a0094936fa472122680656692a3cd"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.773745 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" event={"ID":"2524c61e-9353-41bd-acda-66380b2381b2","Type":"ContainerStarted","Data":"fb95cf865d9d86576a33d5821fc49072b36573ef0160317efad665457ad2ef3e"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.773769 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" event={"ID":"2524c61e-9353-41bd-acda-66380b2381b2","Type":"ContainerStarted","Data":"5b812d0c8b7c54ec0d577ab58f9d07de710d48958c2906a05159bf574b9accf1"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.775142 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.776261 4991 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mzbqd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: 
connection refused" start-of-body= Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.776325 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" podUID="2524c61e-9353-41bd-acda-66380b2381b2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.783163 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" event={"ID":"82128768-e91b-4e38-bc1b-2915bc9d4436","Type":"ContainerStarted","Data":"6f7ee1cc49b7ff8da7a6748bff505b9628344f439e141bfff9999c30c6cc1efc"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.791566 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qhtk2" event={"ID":"e8f82106-cdc2-4dec-a30d-221e17bdc50d","Type":"ContainerStarted","Data":"0721df91c6c86e045b0eab4563bb8701fd039d56cb13407eb8032ca1f52f4162"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.791623 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qhtk2" event={"ID":"e8f82106-cdc2-4dec-a30d-221e17bdc50d","Type":"ContainerStarted","Data":"d2595d26b9779bfc6c23aec42163e5aa8b95f56e183f0ef49e25cdc05385e925"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.793772 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" event={"ID":"fdb62c42-c462-451d-a594-29333e494057","Type":"ContainerStarted","Data":"aa23f46540957f667660b7c3712dd2dce3395afd1160cf88dd49b051f7d4c73a"} Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.827334 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.842182 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2xvjl" podStartSLOduration=128.842142017 podStartE2EDuration="2m8.842142017s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:40.803946405 +0000 UTC m=+154.154236873" watchObservedRunningTime="2025-12-06 01:04:40.842142017 +0000 UTC m=+154.192432475" Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.845310 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.345286441 +0000 UTC m=+154.695576909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.879412 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.928609 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.929052 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:40 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:40 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:40 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:40 crc kubenswrapper[4991]: I1206 01:04:40.929122 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:40 crc kubenswrapper[4991]: E1206 01:04:40.930328 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.430304665 +0000 UTC m=+154.780595123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.040387 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.040813 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.540796211 +0000 UTC m=+154.891086679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.136600 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c9qfb" podStartSLOduration=129.136578014 podStartE2EDuration="2m9.136578014s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.134106277 +0000 UTC m=+154.484396745" watchObservedRunningTime="2025-12-06 01:04:41.136578014 +0000 UTC m=+154.486868482" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.147722 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.148212 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.648193524 +0000 UTC m=+154.998483992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.199176 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.229954 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pnxx4"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.236520 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.255625 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.256102 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.756086701 +0000 UTC m=+155.106377169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.270293 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.279222 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hxspx"] Dec 06 01:04:41 crc kubenswrapper[4991]: W1206 01:04:41.279941 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf947a51_75da_452e_8947_561b0230bcab.slice/crio-82fcafdf4a6ddde241722f81cb7d9264a3affcce169630556e3e9ed5dc20810f WatchSource:0}: Error finding container 82fcafdf4a6ddde241722f81cb7d9264a3affcce169630556e3e9ed5dc20810f: Status 404 returned error can't find the container with id 82fcafdf4a6ddde241722f81cb7d9264a3affcce169630556e3e9ed5dc20810f Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.282420 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jrjzh" podStartSLOduration=129.282396395 podStartE2EDuration="2m9.282396395s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.269218722 +0000 UTC m=+154.619509190" watchObservedRunningTime="2025-12-06 01:04:41.282396395 +0000 UTC 
m=+154.632686863" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.340706 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" podStartSLOduration=129.340688144 podStartE2EDuration="2m9.340688144s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.339351498 +0000 UTC m=+154.689641976" watchObservedRunningTime="2025-12-06 01:04:41.340688144 +0000 UTC m=+154.690978612" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.341044 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qhtk2" podStartSLOduration=129.341037663 podStartE2EDuration="2m9.341037663s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.310080835 +0000 UTC m=+154.660371303" watchObservedRunningTime="2025-12-06 01:04:41.341037663 +0000 UTC m=+154.691328141" Dec 06 01:04:41 crc kubenswrapper[4991]: W1206 01:04:41.351213 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a61cec3_9e94_43d4_8e8e_42674e6e6ef9.slice/crio-4f91c642d797d1c5be01252783e4f63cf1814540dd93172aa79c8203327e7cef WatchSource:0}: Error finding container 4f91c642d797d1c5be01252783e4f63cf1814540dd93172aa79c8203327e7cef: Status 404 returned error can't find the container with id 4f91c642d797d1c5be01252783e4f63cf1814540dd93172aa79c8203327e7cef Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.356930 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 
01:04:41.360389 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.360957 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.361490 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.86146938 +0000 UTC m=+155.211759848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.369065 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vvg2d"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.371483 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xh4tr" podStartSLOduration=129.371466597 podStartE2EDuration="2m9.371466597s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.362796425 +0000 UTC m=+154.713086893" watchObservedRunningTime="2025-12-06 01:04:41.371466597 +0000 UTC m=+154.721757065" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.383849 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mgs4q" podStartSLOduration=129.383827258 podStartE2EDuration="2m9.383827258s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.382907494 +0000 UTC m=+154.733197962" watchObservedRunningTime="2025-12-06 01:04:41.383827258 +0000 UTC m=+154.734117726" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.424944 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hwqgl" podStartSLOduration=129.424920308 podStartE2EDuration="2m9.424920308s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.422876003 +0000 UTC m=+154.773166471" watchObservedRunningTime="2025-12-06 01:04:41.424920308 +0000 UTC m=+154.775210766" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.466932 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.467747 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:41.967731813 +0000 UTC m=+155.318022281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.487480 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" podStartSLOduration=130.487459031 podStartE2EDuration="2m10.487459031s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.485596441 +0000 UTC m=+154.835886909" watchObservedRunningTime="2025-12-06 01:04:41.487459031 +0000 UTC m=+154.837749499" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.488082 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vphlm" podStartSLOduration=130.488076037 podStartE2EDuration="2m10.488076037s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.450886492 +0000 UTC m=+154.801176960" watchObservedRunningTime="2025-12-06 01:04:41.488076037 +0000 UTC m=+154.838366505" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.566285 4991 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgg26" podStartSLOduration=129.566261259 podStartE2EDuration="2m9.566261259s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.537237012 +0000 UTC m=+154.887527480" watchObservedRunningTime="2025-12-06 01:04:41.566261259 +0000 UTC m=+154.916551727" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.569206 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.575678 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.07563202 +0000 UTC m=+155.425922488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.577740 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6qz6"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.642730 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wlpng"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.671700 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.672412 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.172396218 +0000 UTC m=+155.522686686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.679975 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mwwth"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.695092 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.714682 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-647zw"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.722606 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxb7h"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.780821 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.781208 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.281190339 +0000 UTC m=+155.631480807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.784749 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv"] Dec 06 01:04:41 crc kubenswrapper[4991]: W1206 01:04:41.785131 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf130c551_8102_4733_8447_8f749c2f90f5.slice/crio-b98a065022110a0ee9412d27218596e2bd0b333457d1d235871785b7912a32ee WatchSource:0}: Error finding container b98a065022110a0ee9412d27218596e2bd0b333457d1d235871785b7912a32ee: Status 404 returned error can't find the container with id b98a065022110a0ee9412d27218596e2bd0b333457d1d235871785b7912a32ee Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.799334 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.816995 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.818792 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-s7xc9"] Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.834431 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" 
event={"ID":"54fd8fc3-c45f-43d0-aeff-65e3b1669071","Type":"ContainerStarted","Data":"86a89e5dded5081b928f0cd26c3a4105e33930338ab48f2768ac7444056b46a7"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.855658 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" event={"ID":"0c087aa7-b699-4d07-adb8-e3082e693727","Type":"ContainerStarted","Data":"8db04a8ed9562f62813e1123343d2125ec27c92d3a4b0db7082caa17030164e1"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.855719 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" event={"ID":"0c087aa7-b699-4d07-adb8-e3082e693727","Type":"ContainerStarted","Data":"4f66ae9fd86d86d013ae567b25365bb8d5d542bb625fd09fb77c3ebab129ab91"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.857688 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" event={"ID":"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9","Type":"ContainerStarted","Data":"4f91c642d797d1c5be01252783e4f63cf1814540dd93172aa79c8203327e7cef"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.859716 4991 generic.go:334] "Generic (PLEG): container finished" podID="82128768-e91b-4e38-bc1b-2915bc9d4436" containerID="3082510b4c7c5a7afd031bcda0a6e0322cd1e0627ed5d2e3f3f9ba2ec41eefc6" exitCode=0 Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.859775 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" event={"ID":"82128768-e91b-4e38-bc1b-2915bc9d4436","Type":"ContainerDied","Data":"3082510b4c7c5a7afd031bcda0a6e0322cd1e0627ed5d2e3f3f9ba2ec41eefc6"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.872890 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" 
event={"ID":"a8e10183-5783-464a-96f1-2157a8690d9b","Type":"ContainerStarted","Data":"c59c09de35784bbc6c4415ecc98d77fa68d17d8b4fbdd83e9987ac730411de69"} Dec 06 01:04:41 crc kubenswrapper[4991]: W1206 01:04:41.873952 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbd5656f_5e1f_4c7b_921d_6a13c8145217.slice/crio-2bb278e9b6c7dd02c2ad84e2943fb8528c532c93449d6d44db52a96c09997576 WatchSource:0}: Error finding container 2bb278e9b6c7dd02c2ad84e2943fb8528c532c93449d6d44db52a96c09997576: Status 404 returned error can't find the container with id 2bb278e9b6c7dd02c2ad84e2943fb8528c532c93449d6d44db52a96c09997576 Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.882759 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.883134 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.383119516 +0000 UTC m=+155.733409984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.890899 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:41 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:41 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:41 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.891422 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.901405 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-647zw" event={"ID":"f130c551-8102-4733-8447-8f749c2f90f5","Type":"ContainerStarted","Data":"b98a065022110a0ee9412d27218596e2bd0b333457d1d235871785b7912a32ee"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.919539 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" event={"ID":"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7","Type":"ContainerStarted","Data":"ec67cf955893f3394170db0299b85e999a90d7d007807670fd5cf51762be693c"} Dec 06 01:04:41 crc kubenswrapper[4991]: 
I1206 01:04:41.932961 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pnxx4" event={"ID":"cf947a51-75da-452e-8947-561b0230bcab","Type":"ContainerStarted","Data":"82fcafdf4a6ddde241722f81cb7d9264a3affcce169630556e3e9ed5dc20810f"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.955014 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" event={"ID":"9e67932d-8fce-413c-895a-c3e9c6fb0ca3","Type":"ContainerStarted","Data":"2696d84dabc2a3b8feca9ec9a832f84ca0cdf0305db0ed93674310ecaf350334"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.959216 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" event={"ID":"eb00ea97-33bb-448f-afb1-372543ca82f5","Type":"ContainerStarted","Data":"9c520f861cc9d0f9aefa5b8e4185984f3639226dc88cba359d2f2650cda76c0a"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.959243 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" event={"ID":"eb00ea97-33bb-448f-afb1-372543ca82f5","Type":"ContainerStarted","Data":"881f5b2864c45232bfa423bab03db97dfa46a5efd20e3d0805146815b5b3f690"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.967826 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hxspx" event={"ID":"1ae11df0-5cf0-4a53-b313-91c9182dde28","Type":"ContainerStarted","Data":"e40969299a863c275a0091709f16f02c5033c445388802bc00c4f88c54766355"} Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.973570 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pnxx4" podStartSLOduration=5.973552855 podStartE2EDuration="5.973552855s" podCreationTimestamp="2025-12-06 01:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.970330789 +0000 UTC m=+155.320621257" watchObservedRunningTime="2025-12-06 01:04:41.973552855 +0000 UTC m=+155.323843323" Dec 06 01:04:41 crc kubenswrapper[4991]: W1206 01:04:41.974659 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fc9504_e68a_4d0a_99bc_3b8da1e78bac.slice/crio-c910362c2ae72bbccdfb95fc6b06e521421a0310d1c01c9853583d75ff15f712 WatchSource:0}: Error finding container c910362c2ae72bbccdfb95fc6b06e521421a0310d1c01c9853583d75ff15f712: Status 404 returned error can't find the container with id c910362c2ae72bbccdfb95fc6b06e521421a0310d1c01c9853583d75ff15f712 Dec 06 01:04:41 crc kubenswrapper[4991]: I1206 01:04:41.983895 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:41 crc kubenswrapper[4991]: E1206 01:04:41.984896 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.484874588 +0000 UTC m=+155.835165056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.019783 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" event={"ID":"115cccb9-61dc-46f6-bdf3-6bc992b22b54","Type":"ContainerStarted","Data":"8ee498d604ee3fbe7db46a1a4d61af249369a6a81449f720135e37fafab263b6"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.019838 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" event={"ID":"115cccb9-61dc-46f6-bdf3-6bc992b22b54","Type":"ContainerStarted","Data":"c6160b9c18f69f44f5f4cbd04a2edb72af12c0661eecd8e70898392bb1aae053"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.019851 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" event={"ID":"115cccb9-61dc-46f6-bdf3-6bc992b22b54","Type":"ContainerStarted","Data":"225ae8942d86598dff7fea71d1f867c236992d5d545a3a9a0ecddee43c743176"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.032141 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" event={"ID":"102ad02d-65ff-4d28-9d31-c95f66848f71","Type":"ContainerStarted","Data":"13a285b893c55db8a84108f19b26fed756eea52bca275b96fe1686064c4a7297"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.058870 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xgdbs" podStartSLOduration=130.058852077 podStartE2EDuration="2m10.058852077s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:42.058470237 +0000 UTC m=+155.408760705" watchObservedRunningTime="2025-12-06 01:04:42.058852077 +0000 UTC m=+155.409142545" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.059905 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8d52f" podStartSLOduration=131.059899245 podStartE2EDuration="2m11.059899245s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:41.992999585 +0000 UTC m=+155.343290053" watchObservedRunningTime="2025-12-06 01:04:42.059899245 +0000 UTC m=+155.410189713" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.074751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" event={"ID":"3acd3a6f-6dab-4bff-8f82-b96a8abc1555","Type":"ContainerStarted","Data":"e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.078048 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.085868 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: 
\"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.086104 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6qz6" event={"ID":"65728f91-9c05-4fc3-af26-8ce6838a2d36","Type":"ContainerStarted","Data":"0a0fa67813aef024a5a1ce60a9723635a970cf6c8ce95ad8c562272b4d6e2e08"} Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.087223 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.587205976 +0000 UTC m=+155.937496444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.089206 4991 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mw769 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.089354 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" podUID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 06 
01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.108029 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" podStartSLOduration=131.108002092 podStartE2EDuration="2m11.108002092s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:42.10716407 +0000 UTC m=+155.457454538" watchObservedRunningTime="2025-12-06 01:04:42.108002092 +0000 UTC m=+155.458292560" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.118352 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9pxz5" event={"ID":"4785dba6-d446-4577-873f-7ef1b50887aa","Type":"ContainerStarted","Data":"77cff1ef501cecd09c64506cf9e00359c539faf80d88be930a9aebbcb882b0ff"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.143210 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" event={"ID":"fdb62c42-c462-451d-a594-29333e494057","Type":"ContainerStarted","Data":"a6d9851d55356c6500cc7bd02dea574cf6e7b10b8bc8d0012583b6695f710f52"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.150499 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" event={"ID":"d7a020b5-5981-4927-a254-ecad13dce340","Type":"ContainerStarted","Data":"d6fe11b9186afebc7ef7a4c4ad9ac9fa04e0782a67c38d98b06a766b2dcd389e"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.150561 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" event={"ID":"d7a020b5-5981-4927-a254-ecad13dce340","Type":"ContainerStarted","Data":"5dac2305a1894f215a1b6f12c32c63537137251ac58830ddaf981268f67e4444"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 
01:04:42.158352 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" event={"ID":"36232195-9b1f-42bb-92df-0487bd3bac44","Type":"ContainerStarted","Data":"51a39f3e0d2508d8141d9e087b140e804e62ce0031f2d5dc3e5dd50e42f33c82"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.170487 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97gxp" event={"ID":"8723600d-fcf8-414f-a440-da9d65d41e07","Type":"ContainerStarted","Data":"4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.198121 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.199378 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pn5g9" podStartSLOduration=130.199356426 podStartE2EDuration="2m10.199356426s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:42.19799619 +0000 UTC m=+155.548286648" watchObservedRunningTime="2025-12-06 01:04:42.199356426 +0000 UTC m=+155.549646894" Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.199698 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:42.699667434 +0000 UTC m=+156.049958092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.199851 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" event={"ID":"62e4e6c5-0e68-43dd-b00a-6d1a07973d9a","Type":"ContainerStarted","Data":"9dc6fba64d1c3a8b75a3aa6efc51c5326cbc9c0c7acb124a6703f35fd2cf9791"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.200565 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9pxz5" podStartSLOduration=6.200552538 podStartE2EDuration="6.200552538s" podCreationTimestamp="2025-12-06 01:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:42.158582675 +0000 UTC m=+155.508873153" watchObservedRunningTime="2025-12-06 01:04:42.200552538 +0000 UTC m=+155.550843006" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.234165 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" event={"ID":"67519387-ecfc-4c03-a199-6679431ac96f","Type":"ContainerStarted","Data":"2ff1e645fcdbd7ab533b0a2706f279c286acaa7a018e382487c0f8e34084bcc4"} Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.267307 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.268617 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rpzsp" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.268677 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.269906 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.283441 4991 patch_prober.go:28] interesting pod/apiserver-76f77b778f-h594s container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]log ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]etcd ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/generic-apiserver-start-informers ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/max-in-flight-filter ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 06 01:04:42 crc kubenswrapper[4991]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/project.openshift.io-projectcache ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 06 01:04:42 crc 
kubenswrapper[4991]: [+]poststarthook/openshift.io-startinformers ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 06 01:04:42 crc kubenswrapper[4991]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 06 01:04:42 crc kubenswrapper[4991]: livez check failed Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.283966 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-h594s" podUID="dd63c06e-9514-4718-8678-db55b7deedfc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.295018 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w259" podStartSLOduration=130.294961164 podStartE2EDuration="2m10.294961164s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:42.256965227 +0000 UTC m=+155.607255695" watchObservedRunningTime="2025-12-06 01:04:42.294961164 +0000 UTC m=+155.645251632" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.300766 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.304263 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:42.804245432 +0000 UTC m=+156.154535900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.336469 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-97gxp" podStartSLOduration=131.336434063 podStartE2EDuration="2m11.336434063s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:42.241101203 +0000 UTC m=+155.591391671" watchObservedRunningTime="2025-12-06 01:04:42.336434063 +0000 UTC m=+155.686724531" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.404188 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.405963 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:42.905937223 +0000 UTC m=+156.256227691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.511047 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.511929 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.011901758 +0000 UTC m=+156.362192226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.589462 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wmtfj"] Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.590694 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.601810 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.612601 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.612954 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.112937951 +0000 UTC m=+156.463228419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.714278 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.714356 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-catalog-content\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.714399 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4v6x\" (UniqueName: \"kubernetes.io/projected/aeca113e-21c6-4114-9130-111b9c2e051a-kube-api-access-r4v6x\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.714423 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-utilities\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.715185 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.215171776 +0000 UTC m=+156.565462234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.730709 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmtfj"] Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.753447 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6cvk9"] Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.754807 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.757245 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817106 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817449 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-catalog-content\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817489 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-utilities\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817532 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4v6x\" (UniqueName: \"kubernetes.io/projected/aeca113e-21c6-4114-9130-111b9c2e051a-kube-api-access-r4v6x\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817558 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-catalog-content\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817584 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjwt\" (UniqueName: \"kubernetes.io/projected/271a982a-6005-4d08-b284-6b140ba25f07-kube-api-access-ckjwt\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.817608 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-utilities\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.818228 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-utilities\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.818315 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.318298075 +0000 UTC m=+156.668588533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.818546 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-catalog-content\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.820087 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cvk9"] Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.859247 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4v6x\" (UniqueName: \"kubernetes.io/projected/aeca113e-21c6-4114-9130-111b9c2e051a-kube-api-access-r4v6x\") pod \"certified-operators-wmtfj\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.899762 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:42 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:42 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:42 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.899896 4991 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.912105 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzvgr"] Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.913351 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.921167 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.921255 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-utilities\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.921291 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-catalog-content\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.921310 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ckjwt\" (UniqueName: \"kubernetes.io/projected/271a982a-6005-4d08-b284-6b140ba25f07-kube-api-access-ckjwt\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: E1206 01:04:42.922004 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.421965648 +0000 UTC m=+156.772256136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.922486 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-utilities\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.922713 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-catalog-content\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.923846 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-qzvgr"] Dec 06 01:04:42 crc kubenswrapper[4991]: I1206 01:04:42.981911 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjwt\" (UniqueName: \"kubernetes.io/projected/271a982a-6005-4d08-b284-6b140ba25f07-kube-api-access-ckjwt\") pod \"community-operators-6cvk9\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.037631 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.037824 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.038144 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7m8f\" (UniqueName: \"kubernetes.io/projected/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-kube-api-access-w7m8f\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.038204 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-catalog-content\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.038264 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-utilities\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.038455 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.538434734 +0000 UTC m=+156.888725202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.101319 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rdt9r"] Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.102311 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.140817 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7m8f\" (UniqueName: \"kubernetes.io/projected/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-kube-api-access-w7m8f\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.141366 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.141405 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-catalog-content\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.141460 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-utilities\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.142156 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-utilities\") pod \"certified-operators-qzvgr\" 
(UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.143557 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.643527725 +0000 UTC m=+156.993818373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.144409 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.145070 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-catalog-content\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.203478 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdt9r"] Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.208503 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7m8f\" (UniqueName: \"kubernetes.io/projected/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-kube-api-access-w7m8f\") pod \"certified-operators-qzvgr\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.242721 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.243004 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqvw\" (UniqueName: \"kubernetes.io/projected/6ca586f2-2d2c-4772-9af3-c870daaf5285-kube-api-access-thqvw\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.243074 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-utilities\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.243168 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-catalog-content\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.243357 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.743332346 +0000 UTC m=+157.093622814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.271605 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pnxx4" event={"ID":"cf947a51-75da-452e-8947-561b0230bcab","Type":"ContainerStarted","Data":"8343f58077e0a7f5d900221241e1933f3ec9700a8bcc13f0fe708532be673b69"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.288440 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" event={"ID":"0d67d95f-81bb-491f-87ca-e15aba253031","Type":"ContainerStarted","Data":"3ccb1a5398bff2f86531f39a2156bcb413fb4c5b8e59a75d24b0037488a6e844"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.291566 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" event={"ID":"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141","Type":"ContainerStarted","Data":"12db8e228facd084989f34c8d8ac1bfedea5b12f57cc6e4b5ac5be7a5351e850"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.321485 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" event={"ID":"0c087aa7-b699-4d07-adb8-e3082e693727","Type":"ContainerStarted","Data":"32e4ad2d82bbbd174fcd02bc5667f42a80458c5c5a561ca15325a2a7f911cb33"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.342419 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwwth" 
event={"ID":"bbd5656f-5e1f-4c7b-921d-6a13c8145217","Type":"ContainerStarted","Data":"929813a2e18e5d088f09e122041668a21b955677eb37ace176deca99b858d09e"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.342496 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwwth" event={"ID":"bbd5656f-5e1f-4c7b-921d-6a13c8145217","Type":"ContainerStarted","Data":"2bb278e9b6c7dd02c2ad84e2943fb8528c532c93449d6d44db52a96c09997576"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.345071 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.345925 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqvw\" (UniqueName: \"kubernetes.io/projected/6ca586f2-2d2c-4772-9af3-c870daaf5285-kube-api-access-thqvw\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.345977 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-utilities\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.346053 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.346119 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-catalog-content\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.346511 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.346626 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwwth" podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.346765 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-utilities\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.347496 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.847482182 +0000 UTC m=+157.197772650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.347496 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-catalog-content\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.362689 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" event={"ID":"e335b799-3e6d-4214-bcae-ee6dd64054d5","Type":"ContainerStarted","Data":"1a2866e226d862b60d19d119960c27eb2babd13388fea3f4fcad13c325d25a47"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.381304 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hxspx" event={"ID":"1ae11df0-5cf0-4a53-b313-91c9182dde28","Type":"ContainerStarted","Data":"cbd502a75b6f15f9b40d59ba41faba3029b8128e7e260e24a9a6752353e418f6"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.383447 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.383534 4991 patch_prober.go:28] interesting pod/console-operator-58897d9998-hxspx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.383570 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hxspx" podUID="1ae11df0-5cf0-4a53-b313-91c9182dde28" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.389973 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqvw\" (UniqueName: \"kubernetes.io/projected/6ca586f2-2d2c-4772-9af3-c870daaf5285-kube-api-access-thqvw\") pod \"community-operators-rdt9r\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.403722 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" event={"ID":"6a61cec3-9e94-43d4-8e8e-42674e6e6ef9","Type":"ContainerStarted","Data":"6df49038add3372d43cf9e95e4bb408bd14ec48c061a4a9ac70d1d307a0fd80b"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.405112 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.419877 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mwwth" podStartSLOduration=132.419860107 podStartE2EDuration="2m12.419860107s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.418356627 +0000 UTC m=+156.768647095" watchObservedRunningTime="2025-12-06 
01:04:43.419860107 +0000 UTC m=+156.770150575" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.420783 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" podStartSLOduration=132.420777462 podStartE2EDuration="2m12.420777462s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.378721758 +0000 UTC m=+156.729012226" watchObservedRunningTime="2025-12-06 01:04:43.420777462 +0000 UTC m=+156.771067930" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.429762 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" event={"ID":"2dbea4fd-7c3f-4ce2-bf97-c25431e900e7","Type":"ContainerStarted","Data":"2b7dab64f00dde6ad8b133fa805e8626ac1a2a0fd63170b4a81942035a2e10fb"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.440846 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" event={"ID":"36232195-9b1f-42bb-92df-0487bd3bac44","Type":"ContainerStarted","Data":"454b5816aaf4cee4039f1d6e5fca29cd1fe707eb8118662630efc30d4dba42ee"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.448661 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.449785 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:43.949764077 +0000 UTC m=+157.300054545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.449840 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.450353 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" podStartSLOduration=131.450333543 podStartE2EDuration="2m11.450333543s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.44949935 +0000 UTC m=+156.799789818" watchObservedRunningTime="2025-12-06 01:04:43.450333543 +0000 UTC m=+156.800624011" Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.451031 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:43.951007251 +0000 UTC m=+157.301297719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.464342 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" event={"ID":"54fd8fc3-c45f-43d0-aeff-65e3b1669071","Type":"ContainerStarted","Data":"2ec65cbaefadf7d52cdaf2886d9a95884c96f00e17bf7e8875a1e31aa0ffc184"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.495568 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hxspx" podStartSLOduration=132.495540782 podStartE2EDuration="2m12.495540782s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.494203526 +0000 UTC m=+156.844493994" watchObservedRunningTime="2025-12-06 01:04:43.495540782 +0000 UTC m=+156.845831250" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.505869 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.530297 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vvg2d" podStartSLOduration=132.530273971 podStartE2EDuration="2m12.530273971s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.529186392 +0000 UTC m=+156.879476860" watchObservedRunningTime="2025-12-06 01:04:43.530273971 +0000 UTC m=+156.880564439" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.551776 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.553666 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.053638816 +0000 UTC m=+157.403929294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.594842 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.600263 4991 generic.go:334] "Generic (PLEG): container finished" podID="67519387-ecfc-4c03-a199-6679431ac96f" containerID="1c847a4f74e5c380328de21788c51da86eb5931320c056b3829fe4765db92913" exitCode=0 Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.600402 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" event={"ID":"67519387-ecfc-4c03-a199-6679431ac96f","Type":"ContainerDied","Data":"1c847a4f74e5c380328de21788c51da86eb5931320c056b3829fe4765db92913"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.655646 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.657069 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:44.157046613 +0000 UTC m=+157.507337081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.665727 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-j79nx" podStartSLOduration=132.665697584 podStartE2EDuration="2m12.665697584s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.56500762 +0000 UTC m=+156.915298088" watchObservedRunningTime="2025-12-06 01:04:43.665697584 +0000 UTC m=+157.015988052" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.716809 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6qz6" event={"ID":"65728f91-9c05-4fc3-af26-8ce6838a2d36","Type":"ContainerStarted","Data":"d40e1b6d4cbf97fa5b0d5e15cbc2ec063a64db1ea9d1e0238ae34e967b1def67"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.756597 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.757633 4991 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.257614463 +0000 UTC m=+157.607904941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.782932 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" event={"ID":"9e67932d-8fce-413c-895a-c3e9c6fb0ca3","Type":"ContainerStarted","Data":"9f8f438aeb76fc0561305527d369fa5b2789a804daffdb7903c513019af2297b"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.784433 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.830188 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.861083 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:43 crc kubenswrapper[4991]: 
E1206 01:04:43.863727 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.363704461 +0000 UTC m=+157.713994929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.890706 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" event={"ID":"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac","Type":"ContainerStarted","Data":"c910362c2ae72bbccdfb95fc6b06e521421a0310d1c01c9853583d75ff15f712"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.900705 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f5qmc" podStartSLOduration=131.90066702 podStartE2EDuration="2m11.90066702s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.85467793 +0000 UTC m=+157.204968408" watchObservedRunningTime="2025-12-06 01:04:43.90066702 +0000 UTC m=+157.250957478" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.901772 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 06 01:04:43 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:43 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:43 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.902309 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.938336 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" event={"ID":"ad326998-ebb8-427b-979c-d21b83f25778","Type":"ContainerStarted","Data":"d1c05890ff5516891272152904c5d6c5388f05991c9657618cb3636ce6e63876"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.951889 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" event={"ID":"82128768-e91b-4e38-bc1b-2915bc9d4436","Type":"ContainerStarted","Data":"b71bf6d3d2a88c6a5683d82bccfa923cb0aba65b81ac96a4940a0252c825771c"} Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.951936 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.974050 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:43 crc kubenswrapper[4991]: E1206 01:04:43.975785 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.475754299 +0000 UTC m=+157.826044767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:43 crc kubenswrapper[4991]: I1206 01:04:43.984753 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.015838 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" podStartSLOduration=132.015803631 podStartE2EDuration="2m12.015803631s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:43.996059562 +0000 UTC m=+157.346350050" watchObservedRunningTime="2025-12-06 01:04:44.015803631 +0000 UTC m=+157.366094109" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.076180 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 
01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.085779 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.585758432 +0000 UTC m=+157.936048900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.124429 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" podStartSLOduration=133.124409326 podStartE2EDuration="2m13.124409326s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:44.123140432 +0000 UTC m=+157.473430900" watchObservedRunningTime="2025-12-06 01:04:44.124409326 +0000 UTC m=+157.474699794" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.178366 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.178795 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.678777591 +0000 UTC m=+158.029068059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.219908 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmtfj"] Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.279782 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.280368 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.780348108 +0000 UTC m=+158.130638576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.334774 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cvk9"] Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.395167 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.397329 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:44.897263386 +0000 UTC m=+158.247553854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.409404 4991 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qv97d container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.409502 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" podUID="6a61cec3-9e94-43d4-8e8e-42674e6e6ef9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.497613 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.498083 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-12-06 01:04:44.998067812 +0000 UTC m=+158.348358280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.555043 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-52fr4"] Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.559160 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.576713 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.600740 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.601079 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-catalog-content\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 
01:04:44.601169 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-utilities\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.601241 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8l7\" (UniqueName: \"kubernetes.io/projected/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-kube-api-access-7n8l7\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.601498 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.101473369 +0000 UTC m=+158.451763837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.604964 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52fr4"] Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.702236 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.706184 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.20616251 +0000 UTC m=+158.556452978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.708521 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-catalog-content\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.708561 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-utilities\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.708582 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8l7\" (UniqueName: \"kubernetes.io/projected/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-kube-api-access-7n8l7\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.709362 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-catalog-content\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " 
pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.709761 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-utilities\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.779754 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8l7\" (UniqueName: \"kubernetes.io/projected/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-kube-api-access-7n8l7\") pod \"redhat-marketplace-52fr4\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.809751 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.810210 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.310194833 +0000 UTC m=+158.660485301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.818578 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzvgr"] Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.890384 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:44 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:44 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:44 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.890441 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.906068 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b72bk"] Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.908756 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.914129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:44 crc kubenswrapper[4991]: E1206 01:04:44.914631 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.414614706 +0000 UTC m=+158.764905174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.915756 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:04:44 crc kubenswrapper[4991]: I1206 01:04:44.986769 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.001319 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-647zw" event={"ID":"f130c551-8102-4733-8447-8f749c2f90f5","Type":"ContainerStarted","Data":"a972e355fb20740402174a9e9941fd0d01972b1f10d7248524eefbc5e5ff243c"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.001381 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerStarted","Data":"b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.001403 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerStarted","Data":"8038e52bbcca226d5646d41345f2853f3b6998cdadc3cb1c0b432a13fb630402"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.001541 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.022202 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.022469 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.023193 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.023516 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.023550 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-catalog-content\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.023607 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5nt2\" (UniqueName: \"kubernetes.io/projected/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-kube-api-access-s5nt2\") pod 
\"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.023642 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.023681 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-utilities\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.023867 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.523845918 +0000 UTC m=+158.874136386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.025114 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b72bk"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.025440 4991 generic.go:334] "Generic (PLEG): container finished" podID="aeca113e-21c6-4114-9130-111b9c2e051a" containerID="6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1" exitCode=0 Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.025571 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmtfj" event={"ID":"aeca113e-21c6-4114-9130-111b9c2e051a","Type":"ContainerDied","Data":"6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.025595 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmtfj" event={"ID":"aeca113e-21c6-4114-9130-111b9c2e051a","Type":"ContainerStarted","Data":"a933c843ac99e140d4d26dcbcefee59c0b8452eaea3fb196321d20b783e74663"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.035371 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.086918 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.086956 4991 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.086975 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" event={"ID":"36232195-9b1f-42bb-92df-0487bd3bac44","Type":"ContainerStarted","Data":"ce49af8cc3259581145758d373e10531ca726ca43711683a915ad56f99a656a5"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.100925 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" event={"ID":"d7a020b5-5981-4927-a254-ecad13dce340","Type":"ContainerStarted","Data":"efb54d7d838a29f8678d7a96955d23dd3d2b28a6ee609e0b79ca541a393b8f37"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.116564 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" podStartSLOduration=133.116537958 podStartE2EDuration="2m13.116537958s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.11322761 +0000 UTC m=+158.463518078" watchObservedRunningTime="2025-12-06 01:04:45.116537958 +0000 UTC m=+158.466828426" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.129567 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" event={"ID":"e335b799-3e6d-4214-bcae-ee6dd64054d5","Type":"ContainerStarted","Data":"d607ba3d4c6abca298d7071f7636758f7503b0356aee6a943c453648d9ac6daa"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.135011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.136518 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.135067 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.142269 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-utilities\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.142421 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.142472 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-catalog-content\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.142579 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.642554474 +0000 UTC m=+158.992844942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.153598 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5nt2\" (UniqueName: \"kubernetes.io/projected/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-kube-api-access-s5nt2\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.177460 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-catalog-content\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.191775 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-utilities\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.195162 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" event={"ID":"102ad02d-65ff-4d28-9d31-c95f66848f71","Type":"ContainerStarted","Data":"178fdb8ac5b9c3cbf27f076b779b0fee2b243e09de2ef8e3f8ad4ad00c761311"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.195267 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" event={"ID":"102ad02d-65ff-4d28-9d31-c95f66848f71","Type":"ContainerStarted","Data":"48d1ccecb99b259025b26b8773d7b502734ab80245ddd855ced59a4924cbdf48"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.208550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5nt2\" (UniqueName: \"kubernetes.io/projected/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-kube-api-access-s5nt2\") pod \"redhat-marketplace-b72bk\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.216715 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.250961 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdt9r"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.256609 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" event={"ID":"0d67d95f-81bb-491f-87ca-e15aba253031","Type":"ContainerStarted","Data":"418db56e6e056c536019e0f33d387cd948704052752b80a1e797bb4c32ee374a"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.258739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.261053 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.761023573 +0000 UTC m=+159.111314041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.292775 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jcnt" podStartSLOduration=134.292752262 podStartE2EDuration="2m14.292752262s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.291472108 +0000 UTC m=+158.641762576" watchObservedRunningTime="2025-12-06 01:04:45.292752262 +0000 UTC m=+158.643042730" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.304161 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.304217 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" event={"ID":"67519387-ecfc-4c03-a199-6679431ac96f","Type":"ContainerStarted","Data":"431f11ff6baa4c7f25609b58c1774027c1d68ce873184fd44ea870bbf2a21fd1"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.304247 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" event={"ID":"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141","Type":"ContainerStarted","Data":"4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.306946 4991 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rxb7h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.307040 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" podUID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.320917 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" event={"ID":"b8fc9504-e68a-4d0a-99bc-3b8da1e78bac","Type":"ContainerStarted","Data":"e68e02f3bde19671ef6f679deece0e4e1dbd35ec23fa2304cdba4874e45c064b"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.323697 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jcdx4" podStartSLOduration=133.32368594 podStartE2EDuration="2m13.32368594s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.320591357 +0000 UTC m=+158.670881825" watchObservedRunningTime="2025-12-06 01:04:45.32368594 +0000 UTC m=+158.673976408" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.358886 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bjq9r" event={"ID":"ad326998-ebb8-427b-979c-d21b83f25778","Type":"ContainerStarted","Data":"0909ff803dc65cdfcc7de629201972bd57e2be61cc61f806372f9ffa29358bd4"} Dec 06 01:04:45 crc kubenswrapper[4991]: 
I1206 01:04:45.361354 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.363905 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.863890396 +0000 UTC m=+159.214180864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.382304 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6qz6" event={"ID":"65728f91-9c05-4fc3-af26-8ce6838a2d36","Type":"ContainerStarted","Data":"15de66221747f9d419057df6b0d76af0d592c431af1a9db58a16ac6c7e2def40"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.380107 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-kpfbw" podStartSLOduration=134.380075389 podStartE2EDuration="2m14.380075389s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 
01:04:45.373748859 +0000 UTC m=+158.724039327" watchObservedRunningTime="2025-12-06 01:04:45.380075389 +0000 UTC m=+158.730365857" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.383213 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.401746 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-s7xc9" podStartSLOduration=133.401724548 podStartE2EDuration="2m13.401724548s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.397044712 +0000 UTC m=+158.747335180" watchObservedRunningTime="2025-12-06 01:04:45.401724548 +0000 UTC m=+158.752015026" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.407775 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52fr4"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.408655 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.417651 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" event={"ID":"a8e10183-5783-464a-96f1-2157a8690d9b","Type":"ContainerStarted","Data":"4b869a279c879aab776571e691008f5c8e78ac5a6b833c98963df13ae2a4741e"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.417700 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" event={"ID":"a8e10183-5783-464a-96f1-2157a8690d9b","Type":"ContainerStarted","Data":"0a3d9af81f5f420ee44f8a6f50c4466061ecc8135f1b56ce40e686cfd472aa3e"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.463394 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.467832 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:45.967801705 +0000 UTC m=+159.318092173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.467842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerStarted","Data":"37299552df8e6bc75231d7552e3f82537137191e8d8ddd4bd49e6ab01af96c78"} Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.469708 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.469910 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwwth" podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.494022 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" podStartSLOduration=134.493962285 podStartE2EDuration="2m14.493962285s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.44854313 +0000 UTC m=+158.798833598" 
watchObservedRunningTime="2025-12-06 01:04:45.493962285 +0000 UTC m=+158.844252753" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.500919 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" podStartSLOduration=133.500896981 podStartE2EDuration="2m13.500896981s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.492453115 +0000 UTC m=+158.842743583" watchObservedRunningTime="2025-12-06 01:04:45.500896981 +0000 UTC m=+158.851187449" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.497121 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.504537 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hxspx" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.508367 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vtscj" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.511016 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qv97d" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.545916 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" podStartSLOduration=133.545888965 podStartE2EDuration="2m13.545888965s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.531969482 +0000 UTC 
m=+158.882259950" watchObservedRunningTime="2025-12-06 01:04:45.545888965 +0000 UTC m=+158.896179423" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.567578 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.580673 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.080652385 +0000 UTC m=+159.430942853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.653948 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c6qz6" podStartSLOduration=9.653927475 podStartE2EDuration="9.653927475s" podCreationTimestamp="2025-12-06 01:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.584039445 +0000 UTC m=+158.934329913" watchObservedRunningTime="2025-12-06 01:04:45.653927475 +0000 UTC m=+159.004217963" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.669724 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.670881 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.170862588 +0000 UTC m=+159.521153056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.749817 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wlpng" podStartSLOduration=133.749733618 podStartE2EDuration="2m13.749733618s" podCreationTimestamp="2025-12-06 01:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:45.739012851 +0000 UTC m=+159.089303319" watchObservedRunningTime="2025-12-06 01:04:45.749733618 +0000 UTC m=+159.100024096" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.753847 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p8x59"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.755084 
4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.763469 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.772004 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.772252 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-catalog-content\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.772355 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-utilities\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.772450 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzn62\" (UniqueName: \"kubernetes.io/projected/5c39eb27-3dbb-46d7-a874-ca6605d84134-kube-api-access-fzn62\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" 
Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.772732 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.272711513 +0000 UTC m=+159.623001981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.796003 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p8x59"] Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.876211 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.876586 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzn62\" (UniqueName: \"kubernetes.io/projected/5c39eb27-3dbb-46d7-a874-ca6605d84134-kube-api-access-fzn62\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.876659 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.376620053 +0000 UTC m=+159.726910521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.876719 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.876900 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-catalog-content\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.876954 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-utilities\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: 
E1206 01:04:45.877207 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.377193748 +0000 UTC m=+159.727484216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.877586 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-utilities\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.877699 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-catalog-content\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.898552 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:45 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:45 crc kubenswrapper[4991]: [+]process-running ok Dec 06 
01:04:45 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.898666 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.963685 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzn62\" (UniqueName: \"kubernetes.io/projected/5c39eb27-3dbb-46d7-a874-ca6605d84134-kube-api-access-fzn62\") pod \"redhat-operators-p8x59\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:45 crc kubenswrapper[4991]: I1206 01:04:45.978366 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:45 crc kubenswrapper[4991]: E1206 01:04:45.979217 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.479195277 +0000 UTC m=+159.829485745 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.081478 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.082342 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.582328776 +0000 UTC m=+159.932619244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.118048 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.131461 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bgvdn"] Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.132779 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.143175 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgvdn"] Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.186643 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.187102 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-utilities\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.187182 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-kube-api-access-rxvm5\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.187277 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-catalog-content\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.187434 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.687412627 +0000 UTC m=+160.037703285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.291126 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-catalog-content\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.291180 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-utilities\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.291232 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-kube-api-access-rxvm5\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.291269 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.291684 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.791669766 +0000 UTC m=+160.141960234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.292239 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-catalog-content\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.292481 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-utilities\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.341070 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-kube-api-access-rxvm5\") pod \"redhat-operators-bgvdn\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.398878 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.399862 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:46.89984046 +0000 UTC m=+160.250130928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.494410 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b72bk"] Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.504381 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.504792 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.004777908 +0000 UTC m=+160.355068376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.550451 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-647zw" event={"ID":"f130c551-8102-4733-8447-8f749c2f90f5","Type":"ContainerStarted","Data":"36657c302843f22481bbba4c4639a83e8af496d3f955b4a0465e0701879dae60"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.573465 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.573666 4991 generic.go:334] "Generic (PLEG): container finished" podID="271a982a-6005-4d08-b284-6b140ba25f07" containerID="b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2" exitCode=0 Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.574243 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerDied","Data":"b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.606504 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 
01:04:46.606727 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.106704915 +0000 UTC m=+160.456995383 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.606916 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.607371 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.107364162 +0000 UTC m=+160.457654630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: W1206 01:04:46.636491 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4317558f_42d2_4dd3_8d1e_1e13c4355cc4.slice/crio-3da8328782a21eb259f2beb23675f695d15cc71cb9945a001df2ff766bbcc049 WatchSource:0}: Error finding container 3da8328782a21eb259f2beb23675f695d15cc71cb9945a001df2ff766bbcc049: Status 404 returned error can't find the container with id 3da8328782a21eb259f2beb23675f695d15cc71cb9945a001df2ff766bbcc049 Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.650504 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerID="b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954" exitCode=0 Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.650594 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerDied","Data":"b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.650630 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerStarted","Data":"4d4bb9c08e8fea5636c14076eb629406fd33b5fbc3e6b0b4768bbe6d969f9a68"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.698258 4991 generic.go:334] "Generic (PLEG): 
container finished" podID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerID="5d4dc4ab89acecc898408a73c987ae239047ac63a91cd3820c367cbfe16fa911" exitCode=0 Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.698332 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerDied","Data":"5d4dc4ab89acecc898408a73c987ae239047ac63a91cd3820c367cbfe16fa911"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.710469 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.711253 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.211233321 +0000 UTC m=+160.561523789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.744431 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerID="2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e" exitCode=0 Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.744703 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.744773 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.746734 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52fr4" event={"ID":"7a5f3df1-9238-407b-9fb1-97d41e3a7f11","Type":"ContainerDied","Data":"2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.746777 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52fr4" 
event={"ID":"7a5f3df1-9238-407b-9fb1-97d41e3a7f11","Type":"ContainerStarted","Data":"41e9f134e7bb2926307389d8dbf96ff866c7b71d8613170122a4ce274d6eba65"} Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.748290 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.748343 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwwth" podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.766795 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.797224 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.814114 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.818384 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-06 01:04:47.318368387 +0000 UTC m=+160.668658855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.864920 4991 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.920814 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:46 crc kubenswrapper[4991]: E1206 01:04:46.921545 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.421524757 +0000 UTC m=+160.771815225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.967214 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:46 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:46 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:46 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:46 crc kubenswrapper[4991]: I1206 01:04:46.967306 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:47 crc kubenswrapper[4991]: E1206 01:04:47.024590 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.524567363 +0000 UTC m=+160.874857831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.030002 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.078191 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p8x59"] Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.131918 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:47 crc kubenswrapper[4991]: E1206 01:04:47.132249 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.632229433 +0000 UTC m=+160.982519901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.224639 4991 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-06T01:04:46.864952764Z","Handler":null,"Name":""} Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.233256 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:47 crc kubenswrapper[4991]: E1206 01:04:47.233598 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-06 01:04:47.733583905 +0000 UTC m=+161.083874373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v8msm" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.249234 4991 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.249285 4991 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.273484 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.301138 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-h594s" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.337770 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.343290 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgvdn"] Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.345436 4991 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.439338 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.443941 4991 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.443999 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:47 crc kubenswrapper[4991]: W1206 01:04:47.556531 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb731e7_ddf5_47db_b5b1_607ae7a9f146.slice/crio-05467ac759a80c2da907905b6a3645ad55cf87aa509ef7133d68f09bbfecc74e WatchSource:0}: Error finding container 05467ac759a80c2da907905b6a3645ad55cf87aa509ef7133d68f09bbfecc74e: Status 404 returned error can't find the container with id 05467ac759a80c2da907905b6a3645ad55cf87aa509ef7133d68f09bbfecc74e Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.599663 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v8msm\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.763044 4991 generic.go:334] "Generic (PLEG): container finished" podID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerID="867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e" exitCode=0 Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.763128 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerDied","Data":"867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e"} Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.763161 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerStarted","Data":"02b0650ee0ebaf197aa328805de4859f606d7e9be0ebf16636c8b36fcaf84d53"} Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.828517 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.885115 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:47 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:47 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:47 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.885781 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.992591 4991 generic.go:334] "Generic (PLEG): container finished" podID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerID="e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4" exitCode=0 Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.993611 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b72bk" 
event={"ID":"4317558f-42d2-4dd3-8d1e-1e13c4355cc4","Type":"ContainerDied","Data":"e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4"} Dec 06 01:04:47 crc kubenswrapper[4991]: I1206 01:04:47.993642 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b72bk" event={"ID":"4317558f-42d2-4dd3-8d1e-1e13c4355cc4","Type":"ContainerStarted","Data":"3da8328782a21eb259f2beb23675f695d15cc71cb9945a001df2ff766bbcc049"} Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.017650 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-647zw" event={"ID":"f130c551-8102-4733-8447-8f749c2f90f5","Type":"ContainerStarted","Data":"a2fcf0f4112a6e7a05facf34da2d6f20ae2686bed6c4f8c6488ff3e8beaa08c5"} Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.041341 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerStarted","Data":"05467ac759a80c2da907905b6a3645ad55cf87aa509ef7133d68f09bbfecc74e"} Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.050136 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"47fdd6b8-9c70-4bdc-a103-c8db8445f397","Type":"ContainerStarted","Data":"c8d2000ac73b15dd81a900d6d8728299e1f2b879a4cfeaa5b1fad6d3406f211c"} Dec 06 01:04:48 crc kubenswrapper[4991]: E1206 01:04:48.187032 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb731e7_ddf5_47db_b5b1_607ae7a9f146.slice/crio-conmon-d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238.scope\": RecentStats: unable to find data in memory cache]" Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.477109 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.477724 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.491330 4991 patch_prober.go:28] interesting pod/console-f9d7485db-97gxp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.491400 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-97gxp" podUID="8723600d-fcf8-414f-a440-da9d65d41e07" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.537964 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v8msm"] Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.878245 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.887460 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:48 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:48 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:48 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:48 crc kubenswrapper[4991]: I1206 01:04:48.887525 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" 
podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.060475 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.173186 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-647zw" event={"ID":"f130c551-8102-4733-8447-8f749c2f90f5","Type":"ContainerStarted","Data":"45b377983636627b3c243ef8b745f3dcdb25ebac78e213663e7b33acfe34afcf"} Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.180518 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" event={"ID":"4afb29d9-e4be-4c64-88d4-823d89c15ec8","Type":"ContainerStarted","Data":"509438158e14c60c9a0c12f72509bea658ccf58dad60b25dfdafcfb7ea2f3001"} Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.181737 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.196743 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.196813 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.216772 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-647zw" podStartSLOduration=13.21674088 podStartE2EDuration="13.21674088s" podCreationTimestamp="2025-12-06 01:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:49.194605928 +0000 UTC m=+162.544896396" watchObservedRunningTime="2025-12-06 01:04:49.21674088 +0000 UTC m=+162.567031348" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.217505 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.217972 4991 generic.go:334] "Generic (PLEG): container finished" podID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerID="d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238" exitCode=0 Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.218274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerDied","Data":"d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238"} Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.226804 4991 generic.go:334] "Generic (PLEG): container finished" podID="0d67d95f-81bb-491f-87ca-e15aba253031" containerID="418db56e6e056c536019e0f33d387cd948704052752b80a1e797bb4c32ee374a" exitCode=0 Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.226871 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" event={"ID":"0d67d95f-81bb-491f-87ca-e15aba253031","Type":"ContainerDied","Data":"418db56e6e056c536019e0f33d387cd948704052752b80a1e797bb4c32ee374a"} Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.265767 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" podStartSLOduration=138.265743721 podStartE2EDuration="2m18.265743721s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-06 01:04:49.22717997 +0000 UTC m=+162.577470438" watchObservedRunningTime="2025-12-06 01:04:49.265743721 +0000 UTC m=+162.616034339" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.306673 4991 generic.go:334] "Generic (PLEG): container finished" podID="47fdd6b8-9c70-4bdc-a103-c8db8445f397" containerID="8124b58de0c984a0acee36ed1a182bb43d7ef811967f2b45969690be65503e70" exitCode=0 Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.306743 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"47fdd6b8-9c70-4bdc-a103-c8db8445f397","Type":"ContainerDied","Data":"8124b58de0c984a0acee36ed1a182bb43d7ef811967f2b45969690be65503e70"} Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.544119 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.544192 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwwth" podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.544266 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.544352 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwwth" 
podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.881735 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:49 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:49 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:49 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:49 crc kubenswrapper[4991]: I1206 01:04:49.881819 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.110563 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.112351 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.122475 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.122604 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.132226 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.208671 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80cccc5c-ffaa-43f5-9458-0221eec19fba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.208731 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cccc5c-ffaa-43f5-9458-0221eec19fba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.309917 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80cccc5c-ffaa-43f5-9458-0221eec19fba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.309961 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/80cccc5c-ffaa-43f5-9458-0221eec19fba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.310509 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80cccc5c-ffaa-43f5-9458-0221eec19fba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.346340 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cccc5c-ffaa-43f5-9458-0221eec19fba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.369857 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" event={"ID":"4afb29d9-e4be-4c64-88d4-823d89c15ec8","Type":"ContainerStarted","Data":"c4463813345be68b0861983adfa026ac9ebc3d8366ca7332b286582d723c53b1"} Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.384504 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nt5q5" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.456347 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.885123 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:50 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:50 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:50 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.885638 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:50 crc kubenswrapper[4991]: I1206 01:04:50.915643 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.012879 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.028927 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67d95f-81bb-491f-87ca-e15aba253031-secret-volume\") pod \"0d67d95f-81bb-491f-87ca-e15aba253031\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.029111 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696dw\" (UniqueName: \"kubernetes.io/projected/0d67d95f-81bb-491f-87ca-e15aba253031-kube-api-access-696dw\") pod \"0d67d95f-81bb-491f-87ca-e15aba253031\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.029158 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67d95f-81bb-491f-87ca-e15aba253031-config-volume\") pod \"0d67d95f-81bb-491f-87ca-e15aba253031\" (UID: \"0d67d95f-81bb-491f-87ca-e15aba253031\") " Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.030099 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d67d95f-81bb-491f-87ca-e15aba253031-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d67d95f-81bb-491f-87ca-e15aba253031" (UID: "0d67d95f-81bb-491f-87ca-e15aba253031"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.037968 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d67d95f-81bb-491f-87ca-e15aba253031-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0d67d95f-81bb-491f-87ca-e15aba253031" (UID: "0d67d95f-81bb-491f-87ca-e15aba253031"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.038604 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d67d95f-81bb-491f-87ca-e15aba253031-kube-api-access-696dw" (OuterVolumeSpecName: "kube-api-access-696dw") pod "0d67d95f-81bb-491f-87ca-e15aba253031" (UID: "0d67d95f-81bb-491f-87ca-e15aba253031"). InnerVolumeSpecName "kube-api-access-696dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.133949 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kube-api-access\") pod \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.134049 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kubelet-dir\") pod \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\" (UID: \"47fdd6b8-9c70-4bdc-a103-c8db8445f397\") " Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.134208 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47fdd6b8-9c70-4bdc-a103-c8db8445f397" (UID: "47fdd6b8-9c70-4bdc-a103-c8db8445f397"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.134437 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696dw\" (UniqueName: \"kubernetes.io/projected/0d67d95f-81bb-491f-87ca-e15aba253031-kube-api-access-696dw\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.134471 4991 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.134485 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d67d95f-81bb-491f-87ca-e15aba253031-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.134495 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d67d95f-81bb-491f-87ca-e15aba253031-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.149298 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47fdd6b8-9c70-4bdc-a103-c8db8445f397" (UID: "47fdd6b8-9c70-4bdc-a103-c8db8445f397"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.210841 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.237122 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47fdd6b8-9c70-4bdc-a103-c8db8445f397-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.434249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80cccc5c-ffaa-43f5-9458-0221eec19fba","Type":"ContainerStarted","Data":"41057701a7f32e3f5699b075bab691a4eee6f6395aa93b30ec11a95b1b713efc"} Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.442802 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" event={"ID":"0d67d95f-81bb-491f-87ca-e15aba253031","Type":"ContainerDied","Data":"3ccb1a5398bff2f86531f39a2156bcb413fb4c5b8e59a75d24b0037488a6e844"} Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.442893 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccb1a5398bff2f86531f39a2156bcb413fb4c5b8e59a75d24b0037488a6e844" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.442841 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.448776 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.450284 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"47fdd6b8-9c70-4bdc-a103-c8db8445f397","Type":"ContainerDied","Data":"c8d2000ac73b15dd81a900d6d8728299e1f2b879a4cfeaa5b1fad6d3406f211c"} Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.450376 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d2000ac73b15dd81a900d6d8728299e1f2b879a4cfeaa5b1fad6d3406f211c" Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.880929 4991 patch_prober.go:28] interesting pod/router-default-5444994796-qhtk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 06 01:04:51 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Dec 06 01:04:51 crc kubenswrapper[4991]: [+]process-running ok Dec 06 01:04:51 crc kubenswrapper[4991]: healthz check failed Dec 06 01:04:51 crc kubenswrapper[4991]: I1206 01:04:51.881025 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhtk2" podUID="e8f82106-cdc2-4dec-a30d-221e17bdc50d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 06 01:04:52 crc kubenswrapper[4991]: I1206 01:04:52.480846 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80cccc5c-ffaa-43f5-9458-0221eec19fba","Type":"ContainerStarted","Data":"1ae2977237efa0d677caac4d1dab0dbd39f2433c9e66879e6a6b9d18727a44ea"} Dec 06 01:04:52 crc kubenswrapper[4991]: I1206 01:04:52.510369 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=2.510346154 podStartE2EDuration="2.510346154s" podCreationTimestamp="2025-12-06 01:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:04:52.508056253 +0000 UTC m=+165.858346721" watchObservedRunningTime="2025-12-06 01:04:52.510346154 +0000 UTC m=+165.860636622" Dec 06 01:04:52 crc kubenswrapper[4991]: I1206 01:04:52.905662 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:52 crc kubenswrapper[4991]: I1206 01:04:52.922670 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qhtk2" Dec 06 01:04:53 crc kubenswrapper[4991]: I1206 01:04:53.516758 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-4rtwr_0c087aa7-b699-4d07-adb8-e3082e693727/cluster-samples-operator/0.log" Dec 06 01:04:53 crc kubenswrapper[4991]: I1206 01:04:53.516856 4991 generic.go:334] "Generic (PLEG): container finished" podID="0c087aa7-b699-4d07-adb8-e3082e693727" containerID="8db04a8ed9562f62813e1123343d2125ec27c92d3a4b0db7082caa17030164e1" exitCode=2 Dec 06 01:04:53 crc kubenswrapper[4991]: I1206 01:04:53.517007 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" event={"ID":"0c087aa7-b699-4d07-adb8-e3082e693727","Type":"ContainerDied","Data":"8db04a8ed9562f62813e1123343d2125ec27c92d3a4b0db7082caa17030164e1"} Dec 06 01:04:53 crc kubenswrapper[4991]: I1206 01:04:53.518053 4991 scope.go:117] "RemoveContainer" containerID="8db04a8ed9562f62813e1123343d2125ec27c92d3a4b0db7082caa17030164e1" Dec 06 01:04:53 crc kubenswrapper[4991]: I1206 01:04:53.605951 4991 generic.go:334] "Generic (PLEG): container finished" podID="80cccc5c-ffaa-43f5-9458-0221eec19fba" 
containerID="1ae2977237efa0d677caac4d1dab0dbd39f2433c9e66879e6a6b9d18727a44ea" exitCode=0 Dec 06 01:04:53 crc kubenswrapper[4991]: I1206 01:04:53.606089 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80cccc5c-ffaa-43f5-9458-0221eec19fba","Type":"ContainerDied","Data":"1ae2977237efa0d677caac4d1dab0dbd39f2433c9e66879e6a6b9d18727a44ea"} Dec 06 01:04:54 crc kubenswrapper[4991]: I1206 01:04:54.358305 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-c6qz6" Dec 06 01:04:54 crc kubenswrapper[4991]: I1206 01:04:54.425286 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:54 crc kubenswrapper[4991]: I1206 01:04:54.447294 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/374025e5-ccaa-4cc4-816b-136713f17d6d-metrics-certs\") pod \"network-metrics-daemon-q8mbt\" (UID: \"374025e5-ccaa-4cc4-816b-136713f17d6d\") " pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:54 crc kubenswrapper[4991]: I1206 01:04:54.661713 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q8mbt" Dec 06 01:04:54 crc kubenswrapper[4991]: I1206 01:04:54.669152 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-4rtwr_0c087aa7-b699-4d07-adb8-e3082e693727/cluster-samples-operator/0.log" Dec 06 01:04:54 crc kubenswrapper[4991]: I1206 01:04:54.669610 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4rtwr" event={"ID":"0c087aa7-b699-4d07-adb8-e3082e693727","Type":"ContainerStarted","Data":"398dce48f64bc9efa33046e97ceb6ffa267540a5e3006495aaedaddfdc3c9be0"} Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.209641 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q8mbt"] Dec 06 01:04:55 crc kubenswrapper[4991]: W1206 01:04:55.362043 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374025e5_ccaa_4cc4_816b_136713f17d6d.slice/crio-96cb878a2d6c89f9658b3e34fdedd9d24df14a4a44215092102dd0ea4f25666f WatchSource:0}: Error finding container 96cb878a2d6c89f9658b3e34fdedd9d24df14a4a44215092102dd0ea4f25666f: Status 404 returned error can't find the container with id 96cb878a2d6c89f9658b3e34fdedd9d24df14a4a44215092102dd0ea4f25666f Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.412694 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.458506 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cccc5c-ffaa-43f5-9458-0221eec19fba-kube-api-access\") pod \"80cccc5c-ffaa-43f5-9458-0221eec19fba\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.458663 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80cccc5c-ffaa-43f5-9458-0221eec19fba-kubelet-dir\") pod \"80cccc5c-ffaa-43f5-9458-0221eec19fba\" (UID: \"80cccc5c-ffaa-43f5-9458-0221eec19fba\") " Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.458856 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80cccc5c-ffaa-43f5-9458-0221eec19fba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "80cccc5c-ffaa-43f5-9458-0221eec19fba" (UID: "80cccc5c-ffaa-43f5-9458-0221eec19fba"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.459290 4991 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80cccc5c-ffaa-43f5-9458-0221eec19fba-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.466387 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cccc5c-ffaa-43f5-9458-0221eec19fba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "80cccc5c-ffaa-43f5-9458-0221eec19fba" (UID: "80cccc5c-ffaa-43f5-9458-0221eec19fba"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.560665 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cccc5c-ffaa-43f5-9458-0221eec19fba-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.686289 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.686305 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80cccc5c-ffaa-43f5-9458-0221eec19fba","Type":"ContainerDied","Data":"41057701a7f32e3f5699b075bab691a4eee6f6395aa93b30ec11a95b1b713efc"} Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.686429 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41057701a7f32e3f5699b075bab691a4eee6f6395aa93b30ec11a95b1b713efc" Dec 06 01:04:55 crc kubenswrapper[4991]: I1206 01:04:55.688288 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" event={"ID":"374025e5-ccaa-4cc4-816b-136713f17d6d","Type":"ContainerStarted","Data":"96cb878a2d6c89f9658b3e34fdedd9d24df14a4a44215092102dd0ea4f25666f"} Dec 06 01:04:57 crc kubenswrapper[4991]: I1206 01:04:57.720327 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" event={"ID":"374025e5-ccaa-4cc4-816b-136713f17d6d","Type":"ContainerStarted","Data":"6f301cb6454f037b9fddc0c4d3c885008f7b8a8b74f417d70e4954ed0036d3ea"} Dec 06 01:04:57 crc kubenswrapper[4991]: I1206 01:04:57.898046 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:04:58 crc kubenswrapper[4991]: I1206 01:04:58.489612 4991 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:58 crc kubenswrapper[4991]: I1206 01:04:58.494258 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:04:59 crc kubenswrapper[4991]: I1206 01:04:59.542415 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:59 crc kubenswrapper[4991]: I1206 01:04:59.542479 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwwth" podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:04:59 crc kubenswrapper[4991]: I1206 01:04:59.542504 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwwth container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 06 01:04:59 crc kubenswrapper[4991]: I1206 01:04:59.542572 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwwth" podUID="bbd5656f-5e1f-4c7b-921d-6a13c8145217" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 06 01:05:07 crc kubenswrapper[4991]: I1206 01:05:07.837758 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:05:09 crc kubenswrapper[4991]: I1206 01:05:09.547723 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-mwwth" Dec 06 01:05:14 crc kubenswrapper[4991]: I1206 01:05:14.377297 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 06 01:05:16 crc kubenswrapper[4991]: I1206 01:05:16.739749 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:05:16 crc kubenswrapper[4991]: I1206 01:05:16.739863 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:05:19 crc kubenswrapper[4991]: I1206 01:05:19.560148 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw4lh" Dec 06 01:05:22 crc kubenswrapper[4991]: E1206 01:05:22.783397 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 06 01:05:22 crc kubenswrapper[4991]: E1206 01:05:22.783944 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5nt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-b72bk_openshift-marketplace(4317558f-42d2-4dd3-8d1e-1e13c4355cc4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 01:05:22 crc kubenswrapper[4991]: E1206 01:05:22.785145 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-b72bk" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" Dec 06 01:05:24 crc 
kubenswrapper[4991]: E1206 01:05:24.717670 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-b72bk" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" Dec 06 01:05:24 crc kubenswrapper[4991]: E1206 01:05:24.787924 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 01:05:24 crc kubenswrapper[4991]: E1206 01:05:24.788169 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thqvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rdt9r_openshift-marketplace(6ca586f2-2d2c-4772-9af3-c870daaf5285): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 01:05:24 crc kubenswrapper[4991]: E1206 01:05:24.789328 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rdt9r" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" Dec 06 01:05:26 crc 
kubenswrapper[4991]: I1206 01:05:26.708285 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 01:05:26 crc kubenswrapper[4991]: E1206 01:05:26.711042 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fdd6b8-9c70-4bdc-a103-c8db8445f397" containerName="pruner" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.711070 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fdd6b8-9c70-4bdc-a103-c8db8445f397" containerName="pruner" Dec 06 01:05:26 crc kubenswrapper[4991]: E1206 01:05:26.711127 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cccc5c-ffaa-43f5-9458-0221eec19fba" containerName="pruner" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.711139 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cccc5c-ffaa-43f5-9458-0221eec19fba" containerName="pruner" Dec 06 01:05:26 crc kubenswrapper[4991]: E1206 01:05:26.711162 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d67d95f-81bb-491f-87ca-e15aba253031" containerName="collect-profiles" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.711171 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d67d95f-81bb-491f-87ca-e15aba253031" containerName="collect-profiles" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.711529 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cccc5c-ffaa-43f5-9458-0221eec19fba" containerName="pruner" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.711584 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d67d95f-81bb-491f-87ca-e15aba253031" containerName="collect-profiles" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.711598 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="47fdd6b8-9c70-4bdc-a103-c8db8445f397" containerName="pruner" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.722549 4991 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.722747 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.726370 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.726841 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.736496 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.736553 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.837904 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.838055 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.838137 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:26 crc kubenswrapper[4991]: I1206 01:05:26.858410 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:27 crc kubenswrapper[4991]: I1206 01:05:27.052692 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.334112 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rdt9r" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.453596 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.454474 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckjwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6cvk9_openshift-marketplace(271a982a-6005-4d08-b284-6b140ba25f07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.455754 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6cvk9" podUID="271a982a-6005-4d08-b284-6b140ba25f07" Dec 06 01:05:28 crc 
kubenswrapper[4991]: E1206 01:05:28.470462 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.470674 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fzn62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-p8x59_openshift-marketplace(5c39eb27-3dbb-46d7-a874-ca6605d84134): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.472617 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-p8x59" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.479302 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.479542 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7m8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qzvgr_openshift-marketplace(ab7acc10-27d6-4f6b-9ec1-faf4fb60771d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.481499 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qzvgr" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" Dec 06 01:05:28 crc 
kubenswrapper[4991]: E1206 01:05:28.496940 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.497140 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxvm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-bgvdn_openshift-marketplace(2cb731e7-ddf5-47db-b5b1-607ae7a9f146): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.501577 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bgvdn" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" Dec 06 01:05:28 crc kubenswrapper[4991]: I1206 01:05:28.881085 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 06 01:05:28 crc kubenswrapper[4991]: I1206 01:05:28.978353 4991 generic.go:334] "Generic (PLEG): container finished" podID="aeca113e-21c6-4114-9130-111b9c2e051a" containerID="2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666" exitCode=0 Dec 06 01:05:28 crc kubenswrapper[4991]: I1206 01:05:28.978411 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmtfj" event={"ID":"aeca113e-21c6-4114-9130-111b9c2e051a","Type":"ContainerDied","Data":"2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666"} Dec 06 01:05:28 crc kubenswrapper[4991]: I1206 01:05:28.985039 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerID="f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f" exitCode=0 Dec 06 01:05:28 crc kubenswrapper[4991]: I1206 01:05:28.985107 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52fr4" event={"ID":"7a5f3df1-9238-407b-9fb1-97d41e3a7f11","Type":"ContainerDied","Data":"f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f"} Dec 06 01:05:28 crc 
kubenswrapper[4991]: I1206 01:05:28.987835 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q8mbt" event={"ID":"374025e5-ccaa-4cc4-816b-136713f17d6d","Type":"ContainerStarted","Data":"e4b21fd1df176a72ef4d7a9bd9c4997a71a5207f47e93395c300eaa0b2205133"} Dec 06 01:05:28 crc kubenswrapper[4991]: I1206 01:05:28.992428 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a48da04-8ff0-4b3f-8064-a1d2e086ea09","Type":"ContainerStarted","Data":"c16736c89d8a37a98c0974d2f761db3e45812fdf0ed2d20baa8bf5f9dc63ac56"} Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.993825 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bgvdn" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.993895 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6cvk9" podUID="271a982a-6005-4d08-b284-6b140ba25f07" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.994072 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-p8x59" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" Dec 06 01:05:28 crc kubenswrapper[4991]: E1206 01:05:28.998573 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qzvgr" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" Dec 06 01:05:29 crc kubenswrapper[4991]: I1206 01:05:29.036998 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q8mbt" podStartSLOduration=178.036909951 podStartE2EDuration="2m58.036909951s" podCreationTimestamp="2025-12-06 01:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:05:29.02561357 +0000 UTC m=+202.375904048" watchObservedRunningTime="2025-12-06 01:05:29.036909951 +0000 UTC m=+202.387200429" Dec 06 01:05:30 crc kubenswrapper[4991]: I1206 01:05:30.002214 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmtfj" event={"ID":"aeca113e-21c6-4114-9130-111b9c2e051a","Type":"ContainerStarted","Data":"076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725"} Dec 06 01:05:30 crc kubenswrapper[4991]: I1206 01:05:30.008285 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52fr4" event={"ID":"7a5f3df1-9238-407b-9fb1-97d41e3a7f11","Type":"ContainerStarted","Data":"40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd"} Dec 06 01:05:30 crc kubenswrapper[4991]: I1206 01:05:30.010762 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a48da04-8ff0-4b3f-8064-a1d2e086ea09","Type":"ContainerStarted","Data":"4cf792fe8867c3469754a954896e0d4fda25f5a133fec392985fe1011c33d37c"} Dec 06 01:05:30 crc kubenswrapper[4991]: I1206 01:05:30.019396 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wmtfj" podStartSLOduration=3.666561525 podStartE2EDuration="48.019382225s" 
podCreationTimestamp="2025-12-06 01:04:42 +0000 UTC" firstStartedPulling="2025-12-06 01:04:45.034979306 +0000 UTC m=+158.385269774" lastFinishedPulling="2025-12-06 01:05:29.387800006 +0000 UTC m=+202.738090474" observedRunningTime="2025-12-06 01:05:30.019299023 +0000 UTC m=+203.369589491" watchObservedRunningTime="2025-12-06 01:05:30.019382225 +0000 UTC m=+203.369672683" Dec 06 01:05:30 crc kubenswrapper[4991]: I1206 01:05:30.044275 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-52fr4" podStartSLOduration=3.400576207 podStartE2EDuration="46.044249076s" podCreationTimestamp="2025-12-06 01:04:44 +0000 UTC" firstStartedPulling="2025-12-06 01:04:46.7911768 +0000 UTC m=+160.141467268" lastFinishedPulling="2025-12-06 01:05:29.434849679 +0000 UTC m=+202.785140137" observedRunningTime="2025-12-06 01:05:30.041260109 +0000 UTC m=+203.391550597" watchObservedRunningTime="2025-12-06 01:05:30.044249076 +0000 UTC m=+203.394539544" Dec 06 01:05:30 crc kubenswrapper[4991]: I1206 01:05:30.068201 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.068175972 podStartE2EDuration="4.068175972s" podCreationTimestamp="2025-12-06 01:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:05:30.057554229 +0000 UTC m=+203.407844717" watchObservedRunningTime="2025-12-06 01:05:30.068175972 +0000 UTC m=+203.418466440" Dec 06 01:05:31 crc kubenswrapper[4991]: I1206 01:05:31.017404 4991 generic.go:334] "Generic (PLEG): container finished" podID="6a48da04-8ff0-4b3f-8064-a1d2e086ea09" containerID="4cf792fe8867c3469754a954896e0d4fda25f5a133fec392985fe1011c33d37c" exitCode=0 Dec 06 01:05:31 crc kubenswrapper[4991]: I1206 01:05:31.017551 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a48da04-8ff0-4b3f-8064-a1d2e086ea09","Type":"ContainerDied","Data":"4cf792fe8867c3469754a954896e0d4fda25f5a133fec392985fe1011c33d37c"} Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.332712 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.504822 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 01:05:32 crc kubenswrapper[4991]: E1206 01:05:32.505595 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a48da04-8ff0-4b3f-8064-a1d2e086ea09" containerName="pruner" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.505611 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a48da04-8ff0-4b3f-8064-a1d2e086ea09" containerName="pruner" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.505717 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a48da04-8ff0-4b3f-8064-a1d2e086ea09" containerName="pruner" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.506251 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.515714 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.529931 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kubelet-dir\") pod \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.530050 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kube-api-access\") pod \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\" (UID: \"6a48da04-8ff0-4b3f-8064-a1d2e086ea09\") " Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.530566 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a48da04-8ff0-4b3f-8064-a1d2e086ea09" (UID: "6a48da04-8ff0-4b3f-8064-a1d2e086ea09"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.541230 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a48da04-8ff0-4b3f-8064-a1d2e086ea09" (UID: "6a48da04-8ff0-4b3f-8064-a1d2e086ea09"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.632389 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ea93134-143a-4533-a84c-d23e1f1e505c-kube-api-access\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.632452 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-var-lock\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.632498 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.633954 4991 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.634006 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a48da04-8ff0-4b3f-8064-a1d2e086ea09-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.734770 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6ea93134-143a-4533-a84c-d23e1f1e505c-kube-api-access\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.734824 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-var-lock\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.734856 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.734950 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.735462 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-var-lock\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.754689 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ea93134-143a-4533-a84c-d23e1f1e505c-kube-api-access\") pod \"installer-9-crc\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:32 crc kubenswrapper[4991]: I1206 01:05:32.835032 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.030815 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a48da04-8ff0-4b3f-8064-a1d2e086ea09","Type":"ContainerDied","Data":"c16736c89d8a37a98c0974d2f761db3e45812fdf0ed2d20baa8bf5f9dc63ac56"} Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.030874 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c16736c89d8a37a98c0974d2f761db3e45812fdf0ed2d20baa8bf5f9dc63ac56" Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.030949 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.054283 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.054320 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.056197 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 06 01:05:33 crc kubenswrapper[4991]: W1206 01:05:33.062589 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6ea93134_143a_4533_a84c_d23e1f1e505c.slice/crio-38fca5f913cf059fbb05e14a84f8c3568365d30a4dd5857ea414f411e622a4d4 WatchSource:0}: Error finding container 38fca5f913cf059fbb05e14a84f8c3568365d30a4dd5857ea414f411e622a4d4: Status 404 returned error can't find the container with id 
38fca5f913cf059fbb05e14a84f8c3568365d30a4dd5857ea414f411e622a4d4 Dec 06 01:05:33 crc kubenswrapper[4991]: I1206 01:05:33.145680 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.045039 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ea93134-143a-4533-a84c-d23e1f1e505c","Type":"ContainerStarted","Data":"26a0545e430ab3af87fc68591a78b37925f62af92e76e5675eee59ff726fd7c6"} Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.045553 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ea93134-143a-4533-a84c-d23e1f1e505c","Type":"ContainerStarted","Data":"38fca5f913cf059fbb05e14a84f8c3568365d30a4dd5857ea414f411e622a4d4"} Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.078146 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.078122384 podStartE2EDuration="2.078122384s" podCreationTimestamp="2025-12-06 01:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:05:34.062748167 +0000 UTC m=+207.413038655" watchObservedRunningTime="2025-12-06 01:05:34.078122384 +0000 UTC m=+207.428412852" Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.094711 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.916829 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.916941 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:05:34 crc kubenswrapper[4991]: I1206 01:05:34.979815 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:05:35 crc kubenswrapper[4991]: I1206 01:05:35.109775 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:05:37 crc kubenswrapper[4991]: I1206 01:05:37.924922 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mzbqd"] Dec 06 01:05:43 crc kubenswrapper[4991]: I1206 01:05:43.115747 4991 generic.go:334] "Generic (PLEG): container finished" podID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerID="3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d" exitCode=0 Dec 06 01:05:43 crc kubenswrapper[4991]: I1206 01:05:43.116130 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b72bk" event={"ID":"4317558f-42d2-4dd3-8d1e-1e13c4355cc4","Type":"ContainerDied","Data":"3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d"} Dec 06 01:05:43 crc kubenswrapper[4991]: I1206 01:05:43.119270 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerStarted","Data":"3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf"} Dec 06 01:05:43 crc kubenswrapper[4991]: I1206 01:05:43.121827 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerStarted","Data":"47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108"} Dec 06 01:05:44 crc kubenswrapper[4991]: I1206 01:05:44.130295 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="271a982a-6005-4d08-b284-6b140ba25f07" containerID="3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf" exitCode=0 Dec 06 01:05:44 crc kubenswrapper[4991]: I1206 01:05:44.130389 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerDied","Data":"3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf"} Dec 06 01:05:44 crc kubenswrapper[4991]: I1206 01:05:44.135063 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerID="47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108" exitCode=0 Dec 06 01:05:44 crc kubenswrapper[4991]: I1206 01:05:44.135139 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerDied","Data":"47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108"} Dec 06 01:05:46 crc kubenswrapper[4991]: I1206 01:05:46.739913 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:05:46 crc kubenswrapper[4991]: I1206 01:05:46.739978 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:05:46 crc kubenswrapper[4991]: I1206 01:05:46.740045 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 
01:05:46 crc kubenswrapper[4991]: I1206 01:05:46.740676 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:05:46 crc kubenswrapper[4991]: I1206 01:05:46.740780 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15" gracePeriod=600 Dec 06 01:05:48 crc kubenswrapper[4991]: I1206 01:05:48.161181 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15" exitCode=0 Dec 06 01:05:48 crc kubenswrapper[4991]: I1206 01:05:48.161270 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.183477 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerStarted","Data":"de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.187867 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" 
event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerStarted","Data":"a9d8a3c9299db15aea3d5f2665d25a3e110295f98976b860d51452774f186eb0"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.191830 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerStarted","Data":"407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.194845 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerStarted","Data":"ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.197926 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b72bk" event={"ID":"4317558f-42d2-4dd3-8d1e-1e13c4355cc4","Type":"ContainerStarted","Data":"7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.204677 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"fa487fa248ed7047ff7af2295096d5dbccca7d43cd1f2ade21a82400151996cb"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.207858 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerStarted","Data":"50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb"} Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.214336 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rdt9r" 
podStartSLOduration=3.9935009580000003 podStartE2EDuration="1m8.214315992s" podCreationTimestamp="2025-12-06 01:04:43 +0000 UTC" firstStartedPulling="2025-12-06 01:04:46.670995695 +0000 UTC m=+160.021286163" lastFinishedPulling="2025-12-06 01:05:50.891810729 +0000 UTC m=+224.242101197" observedRunningTime="2025-12-06 01:05:51.208794769 +0000 UTC m=+224.559085257" watchObservedRunningTime="2025-12-06 01:05:51.214315992 +0000 UTC m=+224.564606460" Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.254579 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b72bk" podStartSLOduration=4.454932713 podStartE2EDuration="1m7.254554539s" podCreationTimestamp="2025-12-06 01:04:44 +0000 UTC" firstStartedPulling="2025-12-06 01:04:47.998084327 +0000 UTC m=+161.348374785" lastFinishedPulling="2025-12-06 01:05:50.797706143 +0000 UTC m=+224.147996611" observedRunningTime="2025-12-06 01:05:51.252163377 +0000 UTC m=+224.602453855" watchObservedRunningTime="2025-12-06 01:05:51.254554539 +0000 UTC m=+224.604845007" Dec 06 01:05:51 crc kubenswrapper[4991]: I1206 01:05:51.299233 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6cvk9" podStartSLOduration=5.153309225 podStartE2EDuration="1m9.29921222s" podCreationTimestamp="2025-12-06 01:04:42 +0000 UTC" firstStartedPulling="2025-12-06 01:04:46.577643157 +0000 UTC m=+159.927933625" lastFinishedPulling="2025-12-06 01:05:50.723546152 +0000 UTC m=+224.073836620" observedRunningTime="2025-12-06 01:05:51.296335886 +0000 UTC m=+224.646626374" watchObservedRunningTime="2025-12-06 01:05:51.29921222 +0000 UTC m=+224.649502688" Dec 06 01:05:52 crc kubenswrapper[4991]: I1206 01:05:52.216919 4991 generic.go:334] "Generic (PLEG): container finished" podID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerID="a9d8a3c9299db15aea3d5f2665d25a3e110295f98976b860d51452774f186eb0" exitCode=0 Dec 06 01:05:52 crc 
kubenswrapper[4991]: I1206 01:05:52.217048 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerDied","Data":"a9d8a3c9299db15aea3d5f2665d25a3e110295f98976b860d51452774f186eb0"} Dec 06 01:05:52 crc kubenswrapper[4991]: I1206 01:05:52.223729 4991 generic.go:334] "Generic (PLEG): container finished" podID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerID="407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34" exitCode=0 Dec 06 01:05:52 crc kubenswrapper[4991]: I1206 01:05:52.223812 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerDied","Data":"407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34"} Dec 06 01:05:52 crc kubenswrapper[4991]: I1206 01:05:52.226797 4991 generic.go:334] "Generic (PLEG): container finished" podID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerID="ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae" exitCode=0 Dec 06 01:05:52 crc kubenswrapper[4991]: I1206 01:05:52.227235 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerDied","Data":"ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae"} Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.145862 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.146380 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.197051 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.235265 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerStarted","Data":"2d7235cb205a62da000e0e305ec5b0bbc80d9cbce86c88848ecc6b0d66e82285"} Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.249528 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerStarted","Data":"55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc"} Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.253625 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerStarted","Data":"8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade"} Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.256859 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzvgr" podStartSLOduration=4.088317511 podStartE2EDuration="1m11.256845049s" podCreationTimestamp="2025-12-06 01:04:42 +0000 UTC" firstStartedPulling="2025-12-06 01:04:45.46870199 +0000 UTC m=+158.818992458" lastFinishedPulling="2025-12-06 01:05:52.637229528 +0000 UTC m=+225.987519996" observedRunningTime="2025-12-06 01:05:53.256040008 +0000 UTC m=+226.606330486" watchObservedRunningTime="2025-12-06 01:05:53.256845049 +0000 UTC m=+226.607135517" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.279677 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bgvdn" podStartSLOduration=3.835024704 podStartE2EDuration="1m7.279654937s" podCreationTimestamp="2025-12-06 01:04:46 +0000 UTC" firstStartedPulling="2025-12-06 
01:04:49.223596074 +0000 UTC m=+162.573886542" lastFinishedPulling="2025-12-06 01:05:52.668226307 +0000 UTC m=+226.018516775" observedRunningTime="2025-12-06 01:05:53.275524461 +0000 UTC m=+226.625814929" watchObservedRunningTime="2025-12-06 01:05:53.279654937 +0000 UTC m=+226.629945405" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.300806 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p8x59" podStartSLOduration=3.5921051999999998 podStartE2EDuration="1m8.300783152s" podCreationTimestamp="2025-12-06 01:04:45 +0000 UTC" firstStartedPulling="2025-12-06 01:04:47.978259367 +0000 UTC m=+161.328549825" lastFinishedPulling="2025-12-06 01:05:52.686937309 +0000 UTC m=+226.037227777" observedRunningTime="2025-12-06 01:05:53.298503073 +0000 UTC m=+226.648793531" watchObservedRunningTime="2025-12-06 01:05:53.300783152 +0000 UTC m=+226.651073620" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.506856 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.506935 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.596470 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.596558 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:05:53 crc kubenswrapper[4991]: I1206 01:05:53.643031 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:05:54 crc kubenswrapper[4991]: I1206 01:05:54.552168 4991 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-qzvgr" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="registry-server" probeResult="failure" output=< Dec 06 01:05:54 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:05:54 crc kubenswrapper[4991]: > Dec 06 01:05:55 crc kubenswrapper[4991]: I1206 01:05:55.409801 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:05:55 crc kubenswrapper[4991]: I1206 01:05:55.409884 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:05:55 crc kubenswrapper[4991]: I1206 01:05:55.470956 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:05:56 crc kubenswrapper[4991]: I1206 01:05:56.119016 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:05:56 crc kubenswrapper[4991]: I1206 01:05:56.119132 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:05:56 crc kubenswrapper[4991]: I1206 01:05:56.318079 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:05:56 crc kubenswrapper[4991]: I1206 01:05:56.574305 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:05:56 crc kubenswrapper[4991]: I1206 01:05:56.574400 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:05:57 crc kubenswrapper[4991]: I1206 01:05:57.180921 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p8x59" 
podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="registry-server" probeResult="failure" output=< Dec 06 01:05:57 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:05:57 crc kubenswrapper[4991]: > Dec 06 01:05:57 crc kubenswrapper[4991]: I1206 01:05:57.637960 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bgvdn" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="registry-server" probeResult="failure" output=< Dec 06 01:05:57 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:05:57 crc kubenswrapper[4991]: > Dec 06 01:05:59 crc kubenswrapper[4991]: I1206 01:05:59.614668 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b72bk"] Dec 06 01:05:59 crc kubenswrapper[4991]: I1206 01:05:59.615340 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b72bk" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="registry-server" containerID="cri-o://7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730" gracePeriod=2 Dec 06 01:05:59 crc kubenswrapper[4991]: I1206 01:05:59.989004 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.074892 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-utilities\") pod \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.074954 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5nt2\" (UniqueName: \"kubernetes.io/projected/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-kube-api-access-s5nt2\") pod \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.075090 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-catalog-content\") pod \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\" (UID: \"4317558f-42d2-4dd3-8d1e-1e13c4355cc4\") " Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.077686 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-utilities" (OuterVolumeSpecName: "utilities") pod "4317558f-42d2-4dd3-8d1e-1e13c4355cc4" (UID: "4317558f-42d2-4dd3-8d1e-1e13c4355cc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.088229 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-kube-api-access-s5nt2" (OuterVolumeSpecName: "kube-api-access-s5nt2") pod "4317558f-42d2-4dd3-8d1e-1e13c4355cc4" (UID: "4317558f-42d2-4dd3-8d1e-1e13c4355cc4"). InnerVolumeSpecName "kube-api-access-s5nt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.097626 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4317558f-42d2-4dd3-8d1e-1e13c4355cc4" (UID: "4317558f-42d2-4dd3-8d1e-1e13c4355cc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.176912 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.176952 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.176965 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5nt2\" (UniqueName: \"kubernetes.io/projected/4317558f-42d2-4dd3-8d1e-1e13c4355cc4-kube-api-access-s5nt2\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.297781 4991 generic.go:334] "Generic (PLEG): container finished" podID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerID="7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730" exitCode=0 Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.297852 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b72bk" event={"ID":"4317558f-42d2-4dd3-8d1e-1e13c4355cc4","Type":"ContainerDied","Data":"7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730"} Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.297893 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-b72bk" event={"ID":"4317558f-42d2-4dd3-8d1e-1e13c4355cc4","Type":"ContainerDied","Data":"3da8328782a21eb259f2beb23675f695d15cc71cb9945a001df2ff766bbcc049"} Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.297915 4991 scope.go:117] "RemoveContainer" containerID="7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.297854 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b72bk" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.318597 4991 scope.go:117] "RemoveContainer" containerID="3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.331500 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b72bk"] Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.341487 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b72bk"] Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.359093 4991 scope.go:117] "RemoveContainer" containerID="e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.373754 4991 scope.go:117] "RemoveContainer" containerID="7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730" Dec 06 01:06:00 crc kubenswrapper[4991]: E1206 01:06:00.374462 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730\": container with ID starting with 7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730 not found: ID does not exist" containerID="7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.374526 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730"} err="failed to get container status \"7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730\": rpc error: code = NotFound desc = could not find container \"7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730\": container with ID starting with 7fff615cb1fa3b1e88091e369ace21def189a76ac6f283e97408993ac1532730 not found: ID does not exist" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.374578 4991 scope.go:117] "RemoveContainer" containerID="3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d" Dec 06 01:06:00 crc kubenswrapper[4991]: E1206 01:06:00.375144 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d\": container with ID starting with 3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d not found: ID does not exist" containerID="3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.375248 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d"} err="failed to get container status \"3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d\": rpc error: code = NotFound desc = could not find container \"3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d\": container with ID starting with 3651d5406dba0b782f873d1e7f5ea69653d4686790cd64630a3bc0465809208d not found: ID does not exist" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.375336 4991 scope.go:117] "RemoveContainer" containerID="e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4" Dec 06 01:06:00 crc kubenswrapper[4991]: E1206 
01:06:00.375668 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4\": container with ID starting with e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4 not found: ID does not exist" containerID="e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4" Dec 06 01:06:00 crc kubenswrapper[4991]: I1206 01:06:00.375704 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4"} err="failed to get container status \"e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4\": rpc error: code = NotFound desc = could not find container \"e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4\": container with ID starting with e3f9f5f7bf26940b16b28570d965cb244887519b3d1265fd7314d993686966c4 not found: ID does not exist" Dec 06 01:06:01 crc kubenswrapper[4991]: I1206 01:06:01.047588 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" path="/var/lib/kubelet/pods/4317558f-42d2-4dd3-8d1e-1e13c4355cc4/volumes" Dec 06 01:06:02 crc kubenswrapper[4991]: I1206 01:06:02.972747 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" podUID="2524c61e-9353-41bd-acda-66380b2381b2" containerName="oauth-openshift" containerID="cri-o://fb95cf865d9d86576a33d5821fc49072b36573ef0160317efad665457ad2ef3e" gracePeriod=15 Dec 06 01:06:03 crc kubenswrapper[4991]: I1206 01:06:03.223520 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:06:03 crc kubenswrapper[4991]: I1206 01:06:03.577319 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:06:03 crc kubenswrapper[4991]: I1206 01:06:03.623416 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:06:03 crc kubenswrapper[4991]: I1206 01:06:03.674736 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:06:05 crc kubenswrapper[4991]: I1206 01:06:05.015462 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzvgr"] Dec 06 01:06:05 crc kubenswrapper[4991]: I1206 01:06:05.334515 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzvgr" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="registry-server" containerID="cri-o://2d7235cb205a62da000e0e305ec5b0bbc80d9cbce86c88848ecc6b0d66e82285" gracePeriod=2 Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.011101 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdt9r"] Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.011405 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rdt9r" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="registry-server" containerID="cri-o://de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d" gracePeriod=2 Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.161283 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.212448 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.350490 4991 generic.go:334] "Generic 
(PLEG): container finished" podID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerID="2d7235cb205a62da000e0e305ec5b0bbc80d9cbce86c88848ecc6b0d66e82285" exitCode=0 Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.350613 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerDied","Data":"2d7235cb205a62da000e0e305ec5b0bbc80d9cbce86c88848ecc6b0d66e82285"} Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.353195 4991 generic.go:334] "Generic (PLEG): container finished" podID="2524c61e-9353-41bd-acda-66380b2381b2" containerID="fb95cf865d9d86576a33d5821fc49072b36573ef0160317efad665457ad2ef3e" exitCode=0 Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.353248 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" event={"ID":"2524c61e-9353-41bd-acda-66380b2381b2","Type":"ContainerDied","Data":"fb95cf865d9d86576a33d5821fc49072b36573ef0160317efad665457ad2ef3e"} Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.655139 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.727310 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.806130 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.865055 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2"] Dec 06 01:06:06 crc kubenswrapper[4991]: E1206 01:06:06.869081 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2524c61e-9353-41bd-acda-66380b2381b2" containerName="oauth-openshift" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.869121 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2524c61e-9353-41bd-acda-66380b2381b2" containerName="oauth-openshift" Dec 06 01:06:06 crc kubenswrapper[4991]: E1206 01:06:06.869156 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="extract-utilities" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.869169 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="extract-utilities" Dec 06 01:06:06 crc kubenswrapper[4991]: E1206 01:06:06.869191 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="registry-server" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.869207 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="registry-server" Dec 06 01:06:06 crc kubenswrapper[4991]: E1206 01:06:06.869229 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="extract-content" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.869241 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="extract-content" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.869454 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4317558f-42d2-4dd3-8d1e-1e13c4355cc4" containerName="registry-server" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.869484 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2524c61e-9353-41bd-acda-66380b2381b2" containerName="oauth-openshift" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.870275 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.881104 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2"] Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.885938 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-cliconfig\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886023 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-audit-policies\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886048 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-ocp-branding-template\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886078 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2524c61e-9353-41bd-acda-66380b2381b2-audit-dir\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886109 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-trusted-ca-bundle\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886136 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-login\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886160 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt48w\" (UniqueName: \"kubernetes.io/projected/2524c61e-9353-41bd-acda-66380b2381b2-kube-api-access-vt48w\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886200 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-service-ca\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886256 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-serving-cert\") pod 
\"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886275 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-idp-0-file-data\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886294 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-router-certs\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886336 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-provider-selection\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886362 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-session\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: \"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.886392 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-error\") pod \"2524c61e-9353-41bd-acda-66380b2381b2\" (UID: 
\"2524c61e-9353-41bd-acda-66380b2381b2\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.887010 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.887119 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2524c61e-9353-41bd-acda-66380b2381b2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.887670 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.887872 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.888312 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.898194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2524c61e-9353-41bd-acda-66380b2381b2-kube-api-access-vt48w" (OuterVolumeSpecName: "kube-api-access-vt48w") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "kube-api-access-vt48w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.901870 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.904463 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.904971 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.905350 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.908042 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.908456 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.909012 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.909380 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.912462 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2524c61e-9353-41bd-acda-66380b2381b2" (UID: "2524c61e-9353-41bd-acda-66380b2381b2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988087 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-utilities\") pod \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988182 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7m8f\" (UniqueName: \"kubernetes.io/projected/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-kube-api-access-w7m8f\") pod \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988294 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-catalog-content\") pod \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\" (UID: \"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d\") " Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988598 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988728 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: 
\"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988757 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.988827 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989025 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989160 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc 
kubenswrapper[4991]: I1206 01:06:06.989240 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvqs4\" (UniqueName: \"kubernetes.io/projected/a30ca898-e9a6-482d-ae56-b583a4ab48b0-kube-api-access-pvqs4\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989276 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989295 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989353 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989437 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989525 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989590 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989650 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a30ca898-e9a6-482d-ae56-b583a4ab48b0-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989699 4991 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989714 4991 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989727 4991 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2524c61e-9353-41bd-acda-66380b2381b2-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989739 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989749 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989763 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt48w\" (UniqueName: \"kubernetes.io/projected/2524c61e-9353-41bd-acda-66380b2381b2-kube-api-access-vt48w\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989777 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989788 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 
01:06:06.989799 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989811 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989824 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989835 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989846 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.989859 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2524c61e-9353-41bd-acda-66380b2381b2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.992633 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-kube-api-access-w7m8f" 
(OuterVolumeSpecName: "kube-api-access-w7m8f") pod "ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" (UID: "ab7acc10-27d6-4f6b-9ec1-faf4fb60771d"). InnerVolumeSpecName "kube-api-access-w7m8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:06 crc kubenswrapper[4991]: I1206 01:06:06.997037 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-utilities" (OuterVolumeSpecName: "utilities") pod "ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" (UID: "ab7acc10-27d6-4f6b-9ec1-faf4fb60771d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.084583 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" (UID: "ab7acc10-27d6-4f6b-9ec1-faf4fb60771d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092182 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092417 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092448 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092500 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092536 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092575 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092602 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a30ca898-e9a6-482d-ae56-b583a4ab48b0-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092718 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc 
kubenswrapper[4991]: I1206 01:06:07.092749 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092778 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092847 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092894 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvqs4\" (UniqueName: \"kubernetes.io/projected/a30ca898-e9a6-482d-ae56-b583a4ab48b0-kube-api-access-pvqs4\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.092968 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.093042 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.093060 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.093549 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7m8f\" (UniqueName: \"kubernetes.io/projected/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d-kube-api-access-w7m8f\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.096363 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.096733 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.098046 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.098459 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a30ca898-e9a6-482d-ae56-b583a4ab48b0-audit-dir\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.098737 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-session\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.102077 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a30ca898-e9a6-482d-ae56-b583a4ab48b0-audit-policies\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.104402 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " 
pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.109765 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.109779 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.110100 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.111243 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-error\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.111441 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.116746 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a30ca898-e9a6-482d-ae56-b583a4ab48b0-v4-0-config-user-template-login\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.119810 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvqs4\" (UniqueName: \"kubernetes.io/projected/a30ca898-e9a6-482d-ae56-b583a4ab48b0-kube-api-access-pvqs4\") pod \"oauth-openshift-9bc7b6b6b-jbht2\" (UID: \"a30ca898-e9a6-482d-ae56-b583a4ab48b0\") " pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.176722 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.254696 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.298517 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thqvw\" (UniqueName: \"kubernetes.io/projected/6ca586f2-2d2c-4772-9af3-c870daaf5285-kube-api-access-thqvw\") pod \"6ca586f2-2d2c-4772-9af3-c870daaf5285\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.298626 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-utilities\") pod \"6ca586f2-2d2c-4772-9af3-c870daaf5285\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.298688 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-catalog-content\") pod \"6ca586f2-2d2c-4772-9af3-c870daaf5285\" (UID: \"6ca586f2-2d2c-4772-9af3-c870daaf5285\") " Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.299716 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-utilities" (OuterVolumeSpecName: "utilities") pod "6ca586f2-2d2c-4772-9af3-c870daaf5285" (UID: "6ca586f2-2d2c-4772-9af3-c870daaf5285"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.302118 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca586f2-2d2c-4772-9af3-c870daaf5285-kube-api-access-thqvw" (OuterVolumeSpecName: "kube-api-access-thqvw") pod "6ca586f2-2d2c-4772-9af3-c870daaf5285" (UID: "6ca586f2-2d2c-4772-9af3-c870daaf5285"). InnerVolumeSpecName "kube-api-access-thqvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.365487 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ca586f2-2d2c-4772-9af3-c870daaf5285" (UID: "6ca586f2-2d2c-4772-9af3-c870daaf5285"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.369920 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerID="de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d" exitCode=0 Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.369997 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerDied","Data":"de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d"} Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.370034 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdt9r" event={"ID":"6ca586f2-2d2c-4772-9af3-c870daaf5285","Type":"ContainerDied","Data":"4d4bb9c08e8fea5636c14076eb629406fd33b5fbc3e6b0b4768bbe6d969f9a68"} Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.370052 4991 scope.go:117] "RemoveContainer" containerID="de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.370119 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rdt9r" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.377674 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvgr" event={"ID":"ab7acc10-27d6-4f6b-9ec1-faf4fb60771d","Type":"ContainerDied","Data":"37299552df8e6bc75231d7552e3f82537137191e8d8ddd4bd49e6ab01af96c78"} Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.384690 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.385222 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mzbqd" event={"ID":"2524c61e-9353-41bd-acda-66380b2381b2","Type":"ContainerDied","Data":"5b812d0c8b7c54ec0d577ab58f9d07de710d48958c2906a05159bf574b9accf1"} Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.390910 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzvgr" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.405708 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.405745 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thqvw\" (UniqueName: \"kubernetes.io/projected/6ca586f2-2d2c-4772-9af3-c870daaf5285-kube-api-access-thqvw\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.405757 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ca586f2-2d2c-4772-9af3-c870daaf5285-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.416935 4991 scope.go:117] "RemoveContainer" containerID="47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.430042 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdt9r"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.449730 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rdt9r"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.457360 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mzbqd"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.461311 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mzbqd"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.464198 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzvgr"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 
01:06:07.462738 4991 scope.go:117] "RemoveContainer" containerID="b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.467211 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qzvgr"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.501196 4991 scope.go:117] "RemoveContainer" containerID="de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d" Dec 06 01:06:07 crc kubenswrapper[4991]: E1206 01:06:07.503711 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d\": container with ID starting with de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d not found: ID does not exist" containerID="de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.503762 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d"} err="failed to get container status \"de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d\": rpc error: code = NotFound desc = could not find container \"de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d\": container with ID starting with de2bc1f91304afaaa72e75970ebed7714b977230b4531af4ed9bd1fc13f9242d not found: ID does not exist" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.503793 4991 scope.go:117] "RemoveContainer" containerID="47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108" Dec 06 01:06:07 crc kubenswrapper[4991]: E1206 01:06:07.504423 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108\": container with ID 
starting with 47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108 not found: ID does not exist" containerID="47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.504498 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108"} err="failed to get container status \"47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108\": rpc error: code = NotFound desc = could not find container \"47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108\": container with ID starting with 47ff4fdce1d0368be12fe669beb3e141848afce03540d5d93ba0fbc9e8e68108 not found: ID does not exist" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.505112 4991 scope.go:117] "RemoveContainer" containerID="b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954" Dec 06 01:06:07 crc kubenswrapper[4991]: E1206 01:06:07.506512 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954\": container with ID starting with b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954 not found: ID does not exist" containerID="b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.506574 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954"} err="failed to get container status \"b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954\": rpc error: code = NotFound desc = could not find container \"b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954\": container with ID starting with b59d3be3bcaadd91c7ee963da4863c7e7c430f9617109dd86a59f1a0fa667954 not found: 
ID does not exist" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.506607 4991 scope.go:117] "RemoveContainer" containerID="2d7235cb205a62da000e0e305ec5b0bbc80d9cbce86c88848ecc6b0d66e82285" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.509921 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2"] Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.551648 4991 scope.go:117] "RemoveContainer" containerID="a9d8a3c9299db15aea3d5f2665d25a3e110295f98976b860d51452774f186eb0" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.583159 4991 scope.go:117] "RemoveContainer" containerID="5d4dc4ab89acecc898408a73c987ae239047ac63a91cd3820c367cbfe16fa911" Dec 06 01:06:07 crc kubenswrapper[4991]: I1206 01:06:07.606120 4991 scope.go:117] "RemoveContainer" containerID="fb95cf865d9d86576a33d5821fc49072b36573ef0160317efad665457ad2ef3e" Dec 06 01:06:08 crc kubenswrapper[4991]: I1206 01:06:08.396229 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" event={"ID":"a30ca898-e9a6-482d-ae56-b583a4ab48b0","Type":"ContainerStarted","Data":"46961e871241dda29dfcc35959fbe0bc829072280b88e399a3e5996b32d779e3"} Dec 06 01:06:08 crc kubenswrapper[4991]: I1206 01:06:08.396317 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" event={"ID":"a30ca898-e9a6-482d-ae56-b583a4ab48b0","Type":"ContainerStarted","Data":"03a660b91efeec929f4b3d19d044bb34ccac7fffe07a123bc5a1487b196174e6"} Dec 06 01:06:08 crc kubenswrapper[4991]: I1206 01:06:08.396647 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:08 crc kubenswrapper[4991]: I1206 01:06:08.403358 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" Dec 06 01:06:08 crc 
kubenswrapper[4991]: I1206 01:06:08.416258 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9bc7b6b6b-jbht2" podStartSLOduration=31.416237184 podStartE2EDuration="31.416237184s" podCreationTimestamp="2025-12-06 01:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:06:08.41568588 +0000 UTC m=+241.765976348" watchObservedRunningTime="2025-12-06 01:06:08.416237184 +0000 UTC m=+241.766527652" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.050321 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2524c61e-9353-41bd-acda-66380b2381b2" path="/var/lib/kubelet/pods/2524c61e-9353-41bd-acda-66380b2381b2/volumes" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.051979 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" path="/var/lib/kubelet/pods/6ca586f2-2d2c-4772-9af3-c870daaf5285/volumes" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.053231 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" path="/var/lib/kubelet/pods/ab7acc10-27d6-4f6b-9ec1-faf4fb60771d/volumes" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.417805 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bgvdn"] Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.418345 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bgvdn" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="registry-server" containerID="cri-o://55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc" gracePeriod=2 Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.806885 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.844469 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-catalog-content\") pod \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.844518 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-kube-api-access-rxvm5\") pod \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.844684 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-utilities\") pod \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\" (UID: \"2cb731e7-ddf5-47db-b5b1-607ae7a9f146\") " Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.845842 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-utilities" (OuterVolumeSpecName: "utilities") pod "2cb731e7-ddf5-47db-b5b1-607ae7a9f146" (UID: "2cb731e7-ddf5-47db-b5b1-607ae7a9f146"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.855098 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-kube-api-access-rxvm5" (OuterVolumeSpecName: "kube-api-access-rxvm5") pod "2cb731e7-ddf5-47db-b5b1-607ae7a9f146" (UID: "2cb731e7-ddf5-47db-b5b1-607ae7a9f146"). InnerVolumeSpecName "kube-api-access-rxvm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.946184 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-kube-api-access-rxvm5\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.946271 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:09 crc kubenswrapper[4991]: I1206 01:06:09.968479 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cb731e7-ddf5-47db-b5b1-607ae7a9f146" (UID: "2cb731e7-ddf5-47db-b5b1-607ae7a9f146"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.048121 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cb731e7-ddf5-47db-b5b1-607ae7a9f146-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.415869 4991 generic.go:334] "Generic (PLEG): container finished" podID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerID="55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc" exitCode=0 Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.417046 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bgvdn" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.423151 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerDied","Data":"55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc"} Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.423210 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgvdn" event={"ID":"2cb731e7-ddf5-47db-b5b1-607ae7a9f146","Type":"ContainerDied","Data":"05467ac759a80c2da907905b6a3645ad55cf87aa509ef7133d68f09bbfecc74e"} Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.423242 4991 scope.go:117] "RemoveContainer" containerID="55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.452374 4991 scope.go:117] "RemoveContainer" containerID="407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.476007 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bgvdn"] Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.479405 4991 scope.go:117] "RemoveContainer" containerID="d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.479622 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bgvdn"] Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.506771 4991 scope.go:117] "RemoveContainer" containerID="55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc" Dec 06 01:06:10 crc kubenswrapper[4991]: E1206 01:06:10.507353 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc\": container with ID starting with 55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc not found: ID does not exist" containerID="55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.507396 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc"} err="failed to get container status \"55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc\": rpc error: code = NotFound desc = could not find container \"55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc\": container with ID starting with 55f1e8360fcc5ea60ff435c5371b7147e5132be6f20b9393f20955981f90c4bc not found: ID does not exist" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.507435 4991 scope.go:117] "RemoveContainer" containerID="407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34" Dec 06 01:06:10 crc kubenswrapper[4991]: E1206 01:06:10.508034 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34\": container with ID starting with 407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34 not found: ID does not exist" containerID="407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.508071 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34"} err="failed to get container status \"407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34\": rpc error: code = NotFound desc = could not find container \"407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34\": container with ID 
starting with 407e65280e3e96618fd14db8b5f073a865b60c90d9a1fbb8a19109229ea18f34 not found: ID does not exist" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.508091 4991 scope.go:117] "RemoveContainer" containerID="d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238" Dec 06 01:06:10 crc kubenswrapper[4991]: E1206 01:06:10.509209 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238\": container with ID starting with d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238 not found: ID does not exist" containerID="d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238" Dec 06 01:06:10 crc kubenswrapper[4991]: I1206 01:06:10.509323 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238"} err="failed to get container status \"d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238\": rpc error: code = NotFound desc = could not find container \"d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238\": container with ID starting with d5c44e99c2e9fc05a0b5884900c555b4a7de09bc45aac48aaf359adcc08e3238 not found: ID does not exist" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.051308 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" path="/var/lib/kubelet/pods/2cb731e7-ddf5-47db-b5b1-607ae7a9f146/volumes" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114287 4991 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114620 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="registry-server" Dec 06 01:06:11 
crc kubenswrapper[4991]: I1206 01:06:11.114638 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114677 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="extract-content" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114685 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="extract-content" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114700 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114708 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114720 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="extract-utilities" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114728 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="extract-utilities" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114738 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="extract-content" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114746 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="extract-content" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114763 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="extract-utilities" Dec 06 01:06:11 crc 
kubenswrapper[4991]: I1206 01:06:11.114771 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="extract-utilities" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114787 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="extract-utilities" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114797 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="extract-utilities" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114809 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="extract-content" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114817 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="extract-content" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.114829 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114837 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.114971 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7acc10-27d6-4f6b-9ec1-faf4fb60771d" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.115003 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb731e7-ddf5-47db-b5b1-607ae7a9f146" containerName="registry-server" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.115022 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca586f2-2d2c-4772-9af3-c870daaf5285" containerName="registry-server" Dec 06 01:06:11 crc 
kubenswrapper[4991]: I1206 01:06:11.115497 4991 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.115917 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.115959 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef" gracePeriod=15 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.116027 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d" gracePeriod=15 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.115956 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c" gracePeriod=15 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.115856 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0" gracePeriod=15 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.116352 4991 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96" gracePeriod=15 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.116967 4991 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117364 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117386 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117400 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117408 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117418 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117424 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117433 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117439 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117447 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117454 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117466 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117474 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117483 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117490 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.117502 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117509 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117630 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: 
I1206 01:06:11.117643 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117654 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117666 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117674 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117684 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.117920 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178059 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178124 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178181 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178212 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178256 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178284 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178342 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.178370 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.180967 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280389 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280460 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280494 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280537 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280569 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280596 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280612 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280656 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280689 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280568 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280709 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280731 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280622 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280761 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.280947 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.281049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.430223 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.432778 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.433935 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c" exitCode=0 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.433971 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96" exitCode=0 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.434002 
4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef" exitCode=0 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.434011 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d" exitCode=2 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.434081 4991 scope.go:117] "RemoveContainer" containerID="d2c714a0739100044094f55d44461e6a882977103a5d45f49871ddd557a1822f" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.437036 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ea93134-143a-4533-a84c-d23e1f1e505c" containerID="26a0545e430ab3af87fc68591a78b37925f62af92e76e5675eee59ff726fd7c6" exitCode=0 Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.437084 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ea93134-143a-4533-a84c-d23e1f1e505c","Type":"ContainerDied","Data":"26a0545e430ab3af87fc68591a78b37925f62af92e76e5675eee59ff726fd7c6"} Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.438675 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.439370 4991 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 
06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.440118 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:11 crc kubenswrapper[4991]: I1206 01:06:11.475053 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:06:11 crc kubenswrapper[4991]: E1206 01:06:11.512056 4991 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e7ad7dc77b498 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 01:06:11.511202968 +0000 UTC m=+244.861493436,LastTimestamp:2025-12-06 01:06:11.511202968 +0000 UTC m=+244.861493436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.395074 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe 
status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.395644 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.449379 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.454199 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"61edcb31ad91e0541ff3f0bc06fff677b8873ce12cdac2364dc0c80412f471ca"} Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.454401 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a4ac0c46fcb8606259e779ad9b84cd5286103334a1607f646d67c68da04a812d"} Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.455009 4991 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.456327 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.457251 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.733342 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.734712 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.734911 4991 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.735075 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: 
connection refused" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.804585 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-var-lock\") pod \"6ea93134-143a-4533-a84c-d23e1f1e505c\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.804741 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ea93134-143a-4533-a84c-d23e1f1e505c-kube-api-access\") pod \"6ea93134-143a-4533-a84c-d23e1f1e505c\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.804780 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-kubelet-dir\") pod \"6ea93134-143a-4533-a84c-d23e1f1e505c\" (UID: \"6ea93134-143a-4533-a84c-d23e1f1e505c\") " Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.804753 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-var-lock" (OuterVolumeSpecName: "var-lock") pod "6ea93134-143a-4533-a84c-d23e1f1e505c" (UID: "6ea93134-143a-4533-a84c-d23e1f1e505c"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.805027 4991 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.804811 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6ea93134-143a-4533-a84c-d23e1f1e505c" (UID: "6ea93134-143a-4533-a84c-d23e1f1e505c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.811942 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea93134-143a-4533-a84c-d23e1f1e505c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6ea93134-143a-4533-a84c-d23e1f1e505c" (UID: "6ea93134-143a-4533-a84c-d23e1f1e505c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.906548 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ea93134-143a-4533-a84c-d23e1f1e505c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:12 crc kubenswrapper[4991]: I1206 01:06:12.906607 4991 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ea93134-143a-4533-a84c-d23e1f1e505c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.460620 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.461397 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ea93134-143a-4533-a84c-d23e1f1e505c","Type":"ContainerDied","Data":"38fca5f913cf059fbb05e14a84f8c3568365d30a4dd5857ea414f411e622a4d4"} Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.461433 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38fca5f913cf059fbb05e14a84f8c3568365d30a4dd5857ea414f411e622a4d4" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.466598 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.466956 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.597950 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.599448 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.601135 4991 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.601903 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.602580 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.639043 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.639132 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.639159 
4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.639310 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.639334 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.639331 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.740662 4991 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.740715 4991 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:13 crc kubenswrapper[4991]: I1206 01:06:13.740727 4991 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.474009 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.474969 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0" exitCode=0 Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.475076 4991 scope.go:117] "RemoveContainer" containerID="0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.475111 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.496254 4991 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.498275 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.498960 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.502540 4991 scope.go:117] "RemoveContainer" containerID="dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.535359 4991 scope.go:117] "RemoveContainer" containerID="93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.562681 4991 scope.go:117] "RemoveContainer" containerID="ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.579510 4991 scope.go:117] "RemoveContainer" containerID="87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0" Dec 06 01:06:14 crc 
kubenswrapper[4991]: I1206 01:06:14.598748 4991 scope.go:117] "RemoveContainer" containerID="2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.627554 4991 scope.go:117] "RemoveContainer" containerID="0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c" Dec 06 01:06:14 crc kubenswrapper[4991]: E1206 01:06:14.628385 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\": container with ID starting with 0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c not found: ID does not exist" containerID="0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.628524 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c"} err="failed to get container status \"0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\": rpc error: code = NotFound desc = could not find container \"0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c\": container with ID starting with 0d50b2b6313c316a02866d767184d6eb48a30af52e2d86c65a08dab461e73a7c not found: ID does not exist" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.628631 4991 scope.go:117] "RemoveContainer" containerID="dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96" Dec 06 01:06:14 crc kubenswrapper[4991]: E1206 01:06:14.629362 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\": container with ID starting with dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96 not found: ID does not exist" 
containerID="dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.629429 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96"} err="failed to get container status \"dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\": rpc error: code = NotFound desc = could not find container \"dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96\": container with ID starting with dea2dd7d2f84c2fee7ba18165351eae6d0ae06a89a6fac3a67518a56b8d7bc96 not found: ID does not exist" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.629476 4991 scope.go:117] "RemoveContainer" containerID="93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef" Dec 06 01:06:14 crc kubenswrapper[4991]: E1206 01:06:14.629895 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\": container with ID starting with 93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef not found: ID does not exist" containerID="93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.629929 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef"} err="failed to get container status \"93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\": rpc error: code = NotFound desc = could not find container \"93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef\": container with ID starting with 93424ab7a61719069ce7f809e46b7780f0242e4053bdb44c1306189d6ae804ef not found: ID does not exist" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.629963 4991 scope.go:117] 
"RemoveContainer" containerID="ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d" Dec 06 01:06:14 crc kubenswrapper[4991]: E1206 01:06:14.630361 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\": container with ID starting with ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d not found: ID does not exist" containerID="ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.630401 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d"} err="failed to get container status \"ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\": rpc error: code = NotFound desc = could not find container \"ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d\": container with ID starting with ded53fe21e6c004f7067393db39fc90a4856cf2864553348f2833f218711652d not found: ID does not exist" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.630417 4991 scope.go:117] "RemoveContainer" containerID="87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0" Dec 06 01:06:14 crc kubenswrapper[4991]: E1206 01:06:14.630706 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\": container with ID starting with 87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0 not found: ID does not exist" containerID="87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.630737 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0"} err="failed to get container status \"87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\": rpc error: code = NotFound desc = could not find container \"87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0\": container with ID starting with 87b3270c4bd037d04f7df9cb6f0a526e612f2ffd278a4e5ff511ed6dbf3879e0 not found: ID does not exist" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.630756 4991 scope.go:117] "RemoveContainer" containerID="2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72" Dec 06 01:06:14 crc kubenswrapper[4991]: E1206 01:06:14.631063 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\": container with ID starting with 2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72 not found: ID does not exist" containerID="2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72" Dec 06 01:06:14 crc kubenswrapper[4991]: I1206 01:06:14.631107 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72"} err="failed to get container status \"2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\": rpc error: code = NotFound desc = could not find container \"2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72\": container with ID starting with 2dd47ad54dff83c0f836a356295022269da79e8fd07ee1b99af6a5301f5e2e72 not found: ID does not exist" Dec 06 01:06:15 crc kubenswrapper[4991]: I1206 01:06:15.045932 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 06 01:06:16 crc kubenswrapper[4991]: E1206 
01:06:16.622736 4991 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e7ad7dc77b498 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-06 01:06:11.511202968 +0000 UTC m=+244.861493436,LastTimestamp:2025-12-06 01:06:11.511202968 +0000 UTC m=+244.861493436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 06 01:06:17 crc kubenswrapper[4991]: I1206 01:06:17.040170 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc kubenswrapper[4991]: I1206 01:06:17.040475 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc 
kubenswrapper[4991]: E1206 01:06:17.266135 4991 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc kubenswrapper[4991]: E1206 01:06:17.266819 4991 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc kubenswrapper[4991]: E1206 01:06:17.267320 4991 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc kubenswrapper[4991]: E1206 01:06:17.268281 4991 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc kubenswrapper[4991]: E1206 01:06:17.268667 4991 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:17 crc kubenswrapper[4991]: I1206 01:06:17.268706 4991 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 06 01:06:17 crc kubenswrapper[4991]: E1206 01:06:17.269066 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms" Dec 06 01:06:17 
crc kubenswrapper[4991]: E1206 01:06:17.470056 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 06 01:06:17 crc kubenswrapper[4991]: E1206 01:06:17.871231 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 06 01:06:18 crc kubenswrapper[4991]: E1206 01:06:18.673094 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 06 01:06:20 crc kubenswrapper[4991]: E1206 01:06:20.274258 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.037306 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.038793 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.039777 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.057189 4991 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.057238 4991 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:22 crc kubenswrapper[4991]: E1206 01:06:22.057877 4991 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.058662 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:22 crc kubenswrapper[4991]: W1206 01:06:22.087148 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a22d06394675b2c6f470342959f098bd28bee85d4c3f341cf6d53e50e5c3ac6c WatchSource:0}: Error finding container a22d06394675b2c6f470342959f098bd28bee85d4c3f341cf6d53e50e5c3ac6c: Status 404 returned error can't find the container with id a22d06394675b2c6f470342959f098bd28bee85d4c3f341cf6d53e50e5c3ac6c Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.529137 4991 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="de0586d593fd2127c0ae90231124a3faf9f370140bcea01c264dcbe4d4508ceb" exitCode=0 Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.529200 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"de0586d593fd2127c0ae90231124a3faf9f370140bcea01c264dcbe4d4508ceb"} Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.529247 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a22d06394675b2c6f470342959f098bd28bee85d4c3f341cf6d53e50e5c3ac6c"} Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.529545 4991 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.529561 4991 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:22 crc kubenswrapper[4991]: E1206 01:06:22.530013 4991 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.530201 4991 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:22 crc kubenswrapper[4991]: I1206 01:06:22.530594 4991 status_manager.go:851] "Failed to get status for pod" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 06 01:06:23 crc kubenswrapper[4991]: I1206 01:06:23.538233 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2c48897c3ca52a54e8dc40611ecade12ee9fd6df07fd160eb9e280f89b28891f"} Dec 06 01:06:23 crc kubenswrapper[4991]: I1206 01:06:23.538752 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7358a0cc8e7d849679ac4bd7de6e2abb8bf625ffdd6678b8048be493cacdbc8d"} Dec 06 01:06:23 crc kubenswrapper[4991]: I1206 01:06:23.538768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a73e7fac77d8a47a1a6684bd3b6700c5efefb7806dc59eaf3f230c20ecee9704"} Dec 06 01:06:23 crc kubenswrapper[4991]: I1206 01:06:23.538780 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"724918ed415e6149c5a6aa4e0ba691493e81d349855ec5ab2e4f854263715725"} Dec 06 01:06:24 crc kubenswrapper[4991]: I1206 01:06:24.546630 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8299027dfcd0e1949246db67f11c7d5b8366a7cb2cb677d5222cbc3f25cca45e"} Dec 06 01:06:24 crc kubenswrapper[4991]: I1206 01:06:24.547111 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:24 crc kubenswrapper[4991]: I1206 01:06:24.547268 4991 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:24 crc kubenswrapper[4991]: I1206 01:06:24.547312 4991 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:26 crc kubenswrapper[4991]: I1206 01:06:26.565058 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 01:06:26 crc kubenswrapper[4991]: I1206 01:06:26.565548 4991 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5" exitCode=1 Dec 06 01:06:26 crc kubenswrapper[4991]: I1206 01:06:26.565604 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5"} Dec 06 01:06:26 crc kubenswrapper[4991]: I1206 01:06:26.566526 4991 scope.go:117] "RemoveContainer" containerID="99d06fd2d6d2937da62861cbdaead939323e61f5212d889322233f67d15a44e5" Dec 06 01:06:27 crc kubenswrapper[4991]: I1206 01:06:27.059283 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:27 crc kubenswrapper[4991]: I1206 01:06:27.059841 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:27 crc kubenswrapper[4991]: I1206 01:06:27.067789 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:27 crc kubenswrapper[4991]: I1206 01:06:27.578760 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 06 01:06:27 crc kubenswrapper[4991]: I1206 01:06:27.578867 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"acc2835d5517d8994545cdd0622e431f3fb226eadb2ab5329f9a8a4b834f0223"} Dec 06 01:06:29 crc kubenswrapper[4991]: I1206 01:06:29.560447 4991 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:29 crc kubenswrapper[4991]: I1206 01:06:29.601931 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:29 crc kubenswrapper[4991]: I1206 01:06:29.602028 4991 kubelet.go:1909] "Trying to 
delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:29 crc kubenswrapper[4991]: I1206 01:06:29.602059 4991 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:29 crc kubenswrapper[4991]: I1206 01:06:29.629946 4991 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="923671f9-0cdc-4f32-9168-ed6c039fb9a8" Dec 06 01:06:30 crc kubenswrapper[4991]: I1206 01:06:30.479080 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:06:30 crc kubenswrapper[4991]: I1206 01:06:30.600500 4991 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:30 crc kubenswrapper[4991]: I1206 01:06:30.600538 4991 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2f3f8be-7093-40c6-8c6e-6bb65c878b31" Dec 06 01:06:30 crc kubenswrapper[4991]: I1206 01:06:30.606432 4991 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="923671f9-0cdc-4f32-9168-ed6c039fb9a8" Dec 06 01:06:35 crc kubenswrapper[4991]: I1206 01:06:35.734831 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 06 01:06:35 crc kubenswrapper[4991]: I1206 01:06:35.760513 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 06 01:06:35 crc kubenswrapper[4991]: I1206 01:06:35.771188 4991 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.022201 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.043057 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.094551 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.272457 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.499399 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.526015 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.536602 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.867636 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:06:36 crc kubenswrapper[4991]: I1206 01:06:36.872608 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.019819 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.112840 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.317376 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.408200 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.469956 4991 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.571256 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.601127 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.611772 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.691652 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 06 01:06:37 crc kubenswrapper[4991]: I1206 01:06:37.730761 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 06 01:06:38 crc kubenswrapper[4991]: I1206 01:06:38.292657 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 06 01:06:38 crc kubenswrapper[4991]: I1206 01:06:38.641666 4991 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 01:06:38 crc kubenswrapper[4991]: I1206 01:06:38.699953 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 06 01:06:38 crc kubenswrapper[4991]: I1206 01:06:38.947837 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 06 01:06:39 crc kubenswrapper[4991]: I1206 01:06:39.178319 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 06 01:06:39 crc kubenswrapper[4991]: I1206 01:06:39.208762 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 06 01:06:39 crc kubenswrapper[4991]: I1206 01:06:39.384168 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 06 01:06:39 crc kubenswrapper[4991]: I1206 01:06:39.638095 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.187523 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.308051 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.475780 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.574822 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 06 01:06:40 crc kubenswrapper[4991]: 
I1206 01:06:40.692169 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.728237 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.807576 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 06 01:06:40 crc kubenswrapper[4991]: I1206 01:06:40.887583 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.106370 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.232807 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.239205 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.302463 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.554697 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.801434 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 06 01:06:41 crc kubenswrapper[4991]: I1206 01:06:41.926855 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 06 01:06:42 
crc kubenswrapper[4991]: I1206 01:06:42.056579 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 06 01:06:42 crc kubenswrapper[4991]: I1206 01:06:42.152151 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 06 01:06:42 crc kubenswrapper[4991]: I1206 01:06:42.621757 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 06 01:06:42 crc kubenswrapper[4991]: I1206 01:06:42.895137 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 06 01:06:43 crc kubenswrapper[4991]: I1206 01:06:43.279639 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 06 01:06:43 crc kubenswrapper[4991]: I1206 01:06:43.336024 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 01:06:43 crc kubenswrapper[4991]: I1206 01:06:43.486318 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 01:06:43 crc kubenswrapper[4991]: I1206 01:06:43.546792 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 06 01:06:43 crc kubenswrapper[4991]: I1206 01:06:43.637653 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 06 01:06:43 crc kubenswrapper[4991]: I1206 01:06:43.831927 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.368819 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.369614 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.422802 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.556392 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.823035 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.865501 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.920939 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 06 01:06:44 crc kubenswrapper[4991]: I1206 01:06:44.960652 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.276752 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.433458 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.469553 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.508425 4991 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.536950 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.595865 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.620720 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.654709 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.655789 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.880498 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.886173 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.928658 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.963647 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.974861 4991 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 06 01:06:45 crc kubenswrapper[4991]: I1206 01:06:45.992711 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.087590 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.089132 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.160855 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.227376 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.466713 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.603779 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.687835 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.775077 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.809849 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.855412 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.877245 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.889808 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.890270 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.912786 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.913098 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.939601 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 06 01:06:46 crc kubenswrapper[4991]: I1206 01:06:46.999183 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.008678 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.210513 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.308477 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.316311 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.380146 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.382746 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.395176 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.515652 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.538751 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.553172 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.625668 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.651767 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.809487 4991 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.831806 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.851113 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.859559 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 01:06:47 crc kubenswrapper[4991]: I1206 01:06:47.917591 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.050023 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.070661 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.074965 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.139376 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.350069 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.352179 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 06 01:06:48 crc 
kubenswrapper[4991]: I1206 01:06:48.435869 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.439004 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.445418 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.525026 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.623705 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.691977 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.800741 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.823056 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 06 01:06:48 crc kubenswrapper[4991]: I1206 01:06:48.844448 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.023733 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.066808 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 06 01:06:49 crc 
kubenswrapper[4991]: I1206 01:06:49.068019 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.276495 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.288036 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.295433 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.318814 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.425266 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 06 01:06:49 crc kubenswrapper[4991]: I1206 01:06:49.786310 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.090240 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.092514 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.092656 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.140340 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.176114 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.181615 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.304349 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.310570 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.366664 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.416161 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.418701 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.430964 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.515106 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.525808 4991 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.674949 4991 
reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.679312 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.679276025 podStartE2EDuration="39.679276025s" podCreationTimestamp="2025-12-06 01:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:06:29.57454431 +0000 UTC m=+262.924834808" watchObservedRunningTime="2025-12-06 01:06:50.679276025 +0000 UTC m=+284.029566533" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.684574 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.684657 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.684693 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p8x59","openshift-marketplace/redhat-marketplace-52fr4","openshift-marketplace/community-operators-6cvk9","openshift-marketplace/marketplace-operator-79b997595-rxb7h","openshift-marketplace/certified-operators-wmtfj"] Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.685191 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p8x59" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="registry-server" containerID="cri-o://8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade" gracePeriod=30 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.686044 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6cvk9" podUID="271a982a-6005-4d08-b284-6b140ba25f07" 
containerName="registry-server" containerID="cri-o://50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb" gracePeriod=30 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.686338 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wmtfj" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="registry-server" containerID="cri-o://076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725" gracePeriod=30 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.686317 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-52fr4" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="registry-server" containerID="cri-o://40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd" gracePeriod=30 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.686459 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" podUID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" containerName="marketplace-operator" containerID="cri-o://4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f" gracePeriod=30 Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.726127 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.730953 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.733800 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.733764925 podStartE2EDuration="21.733764925s" podCreationTimestamp="2025-12-06 01:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:06:50.726706306 +0000 UTC m=+284.076996804" watchObservedRunningTime="2025-12-06 01:06:50.733764925 +0000 UTC m=+284.084055423" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.739626 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.779059 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.835804 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 06 01:06:50 crc kubenswrapper[4991]: I1206 01:06:50.972945 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.108813 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.112551 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.117293 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.150198 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.151657 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.171292 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.173306 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.210609 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.237197 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.257561 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-utilities\") pod \"271a982a-6005-4d08-b284-6b140ba25f07\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.257632 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-catalog-content\") pod \"5c39eb27-3dbb-46d7-a874-ca6605d84134\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.257684 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-utilities\") pod \"5c39eb27-3dbb-46d7-a874-ca6605d84134\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.258645 4991 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-utilities" (OuterVolumeSpecName: "utilities") pod "271a982a-6005-4d08-b284-6b140ba25f07" (UID: "271a982a-6005-4d08-b284-6b140ba25f07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.259633 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-utilities" (OuterVolumeSpecName: "utilities") pod "5c39eb27-3dbb-46d7-a874-ca6605d84134" (UID: "5c39eb27-3dbb-46d7-a874-ca6605d84134"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.262752 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264764 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzn62\" (UniqueName: \"kubernetes.io/projected/5c39eb27-3dbb-46d7-a874-ca6605d84134-kube-api-access-fzn62\") pod \"5c39eb27-3dbb-46d7-a874-ca6605d84134\" (UID: \"5c39eb27-3dbb-46d7-a874-ca6605d84134\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264802 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-catalog-content\") pod \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264831 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckjwt\" (UniqueName: \"kubernetes.io/projected/271a982a-6005-4d08-b284-6b140ba25f07-kube-api-access-ckjwt\") pod 
\"271a982a-6005-4d08-b284-6b140ba25f07\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264864 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgsrf\" (UniqueName: \"kubernetes.io/projected/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-kube-api-access-tgsrf\") pod \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264885 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-catalog-content\") pod \"aeca113e-21c6-4114-9130-111b9c2e051a\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264911 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-catalog-content\") pod \"271a982a-6005-4d08-b284-6b140ba25f07\" (UID: \"271a982a-6005-4d08-b284-6b140ba25f07\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264934 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-operator-metrics\") pod \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264960 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4v6x\" (UniqueName: \"kubernetes.io/projected/aeca113e-21c6-4114-9130-111b9c2e051a-kube-api-access-r4v6x\") pod \"aeca113e-21c6-4114-9130-111b9c2e051a\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.264995 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-utilities\") pod \"aeca113e-21c6-4114-9130-111b9c2e051a\" (UID: \"aeca113e-21c6-4114-9130-111b9c2e051a\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.265179 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.265199 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.265845 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-utilities" (OuterVolumeSpecName: "utilities") pod "aeca113e-21c6-4114-9130-111b9c2e051a" (UID: "aeca113e-21c6-4114-9130-111b9c2e051a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.272795 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" (UID: "3ad4cd41-d615-4bdd-bd75-baa9c5f2a141"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.272833 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeca113e-21c6-4114-9130-111b9c2e051a-kube-api-access-r4v6x" (OuterVolumeSpecName: "kube-api-access-r4v6x") pod "aeca113e-21c6-4114-9130-111b9c2e051a" (UID: "aeca113e-21c6-4114-9130-111b9c2e051a"). InnerVolumeSpecName "kube-api-access-r4v6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.273497 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-kube-api-access-tgsrf" (OuterVolumeSpecName: "kube-api-access-tgsrf") pod "3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" (UID: "3ad4cd41-d615-4bdd-bd75-baa9c5f2a141"). InnerVolumeSpecName "kube-api-access-tgsrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.273955 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271a982a-6005-4d08-b284-6b140ba25f07-kube-api-access-ckjwt" (OuterVolumeSpecName: "kube-api-access-ckjwt") pod "271a982a-6005-4d08-b284-6b140ba25f07" (UID: "271a982a-6005-4d08-b284-6b140ba25f07"). InnerVolumeSpecName "kube-api-access-ckjwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.277448 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c39eb27-3dbb-46d7-a874-ca6605d84134-kube-api-access-fzn62" (OuterVolumeSpecName: "kube-api-access-fzn62") pod "5c39eb27-3dbb-46d7-a874-ca6605d84134" (UID: "5c39eb27-3dbb-46d7-a874-ca6605d84134"). InnerVolumeSpecName "kube-api-access-fzn62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.297440 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a5f3df1-9238-407b-9fb1-97d41e3a7f11" (UID: "7a5f3df1-9238-407b-9fb1-97d41e3a7f11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.300873 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.324805 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aeca113e-21c6-4114-9130-111b9c2e051a" (UID: "aeca113e-21c6-4114-9130-111b9c2e051a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.329594 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "271a982a-6005-4d08-b284-6b140ba25f07" (UID: "271a982a-6005-4d08-b284-6b140ba25f07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.365924 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-trusted-ca\") pod \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\" (UID: \"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366005 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-utilities\") pod \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366029 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8l7\" (UniqueName: \"kubernetes.io/projected/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-kube-api-access-7n8l7\") pod \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\" (UID: \"7a5f3df1-9238-407b-9fb1-97d41e3a7f11\") " Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366199 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgsrf\" (UniqueName: \"kubernetes.io/projected/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-kube-api-access-tgsrf\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366213 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366224 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/271a982a-6005-4d08-b284-6b140ba25f07-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc 
kubenswrapper[4991]: I1206 01:06:51.366233 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366243 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4v6x\" (UniqueName: \"kubernetes.io/projected/aeca113e-21c6-4114-9130-111b9c2e051a-kube-api-access-r4v6x\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366252 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeca113e-21c6-4114-9130-111b9c2e051a-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366261 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzn62\" (UniqueName: \"kubernetes.io/projected/5c39eb27-3dbb-46d7-a874-ca6605d84134-kube-api-access-fzn62\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366270 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366277 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckjwt\" (UniqueName: \"kubernetes.io/projected/271a982a-6005-4d08-b284-6b140ba25f07-kube-api-access-ckjwt\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366646 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" (UID: 
"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.366849 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-utilities" (OuterVolumeSpecName: "utilities") pod "7a5f3df1-9238-407b-9fb1-97d41e3a7f11" (UID: "7a5f3df1-9238-407b-9fb1-97d41e3a7f11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.368672 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-kube-api-access-7n8l7" (OuterVolumeSpecName: "kube-api-access-7n8l7") pod "7a5f3df1-9238-407b-9fb1-97d41e3a7f11" (UID: "7a5f3df1-9238-407b-9fb1-97d41e3a7f11"). InnerVolumeSpecName "kube-api-access-7n8l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.381059 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c39eb27-3dbb-46d7-a874-ca6605d84134" (UID: "5c39eb27-3dbb-46d7-a874-ca6605d84134"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.396016 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.406682 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.422463 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.445631 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.465196 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.467521 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c39eb27-3dbb-46d7-a874-ca6605d84134-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.467724 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.467894 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8l7\" (UniqueName: \"kubernetes.io/projected/7a5f3df1-9238-407b-9fb1-97d41e3a7f11-kube-api-access-7n8l7\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.468237 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.480982 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.558143 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.588469 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.758967 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.760012 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerID="40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd" exitCode=0 Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.760101 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52fr4" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.760108 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52fr4" event={"ID":"7a5f3df1-9238-407b-9fb1-97d41e3a7f11","Type":"ContainerDied","Data":"40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.760178 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52fr4" event={"ID":"7a5f3df1-9238-407b-9fb1-97d41e3a7f11","Type":"ContainerDied","Data":"41e9f134e7bb2926307389d8dbf96ff866c7b71d8613170122a4ce274d6eba65"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.760211 4991 scope.go:117] "RemoveContainer" containerID="40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.762390 4991 generic.go:334] "Generic (PLEG): container finished" podID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" containerID="4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f" exitCode=0 Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.762448 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" event={"ID":"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141","Type":"ContainerDied","Data":"4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.762485 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" event={"ID":"3ad4cd41-d615-4bdd-bd75-baa9c5f2a141","Type":"ContainerDied","Data":"12db8e228facd084989f34c8d8ac1bfedea5b12f57cc6e4b5ac5be7a5351e850"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.762531 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxb7h" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.767638 4991 generic.go:334] "Generic (PLEG): container finished" podID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerID="8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade" exitCode=0 Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.767708 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerDied","Data":"8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.767742 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8x59" event={"ID":"5c39eb27-3dbb-46d7-a874-ca6605d84134","Type":"ContainerDied","Data":"02b0650ee0ebaf197aa328805de4859f606d7e9be0ebf16636c8b36fcaf84d53"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.768270 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p8x59" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.770860 4991 generic.go:334] "Generic (PLEG): container finished" podID="271a982a-6005-4d08-b284-6b140ba25f07" containerID="50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb" exitCode=0 Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.770921 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerDied","Data":"50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.770965 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvk9" event={"ID":"271a982a-6005-4d08-b284-6b140ba25f07","Type":"ContainerDied","Data":"8038e52bbcca226d5646d41345f2853f3b6998cdadc3cb1c0b432a13fb630402"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.771241 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6cvk9" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.782648 4991 scope.go:117] "RemoveContainer" containerID="f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.783037 4991 generic.go:334] "Generic (PLEG): container finished" podID="aeca113e-21c6-4114-9130-111b9c2e051a" containerID="076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725" exitCode=0 Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.783097 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmtfj" event={"ID":"aeca113e-21c6-4114-9130-111b9c2e051a","Type":"ContainerDied","Data":"076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.783146 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmtfj" event={"ID":"aeca113e-21c6-4114-9130-111b9c2e051a","Type":"ContainerDied","Data":"a933c843ac99e140d4d26dcbcefee59c0b8452eaea3fb196321d20b783e74663"} Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.783186 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wmtfj" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.783794 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.793693 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-52fr4"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.798342 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-52fr4"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.821445 4991 scope.go:117] "RemoveContainer" containerID="2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.822310 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxb7h"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.829073 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxb7h"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.833632 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.836219 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p8x59"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.844655 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p8x59"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.848235 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cvk9"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.857768 4991 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6cvk9"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.861188 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmtfj"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.861599 4991 scope.go:117] "RemoveContainer" containerID="40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.862280 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd\": container with ID starting with 40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd not found: ID does not exist" containerID="40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.862316 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd"} err="failed to get container status \"40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd\": rpc error: code = NotFound desc = could not find container \"40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd\": container with ID starting with 40132ee862b553cee0e0369420776ae9d494e2c0fba515b19494dc1bee2ecfcd not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.862346 4991 scope.go:117] "RemoveContainer" containerID="f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.862790 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f\": container with ID starting with 
f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f not found: ID does not exist" containerID="f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.862822 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f"} err="failed to get container status \"f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f\": rpc error: code = NotFound desc = could not find container \"f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f\": container with ID starting with f16e2167fa4e0ebce63189aad65ac840fd78eb795b5be099e16c527d85fb857f not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.862842 4991 scope.go:117] "RemoveContainer" containerID="2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.863179 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e\": container with ID starting with 2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e not found: ID does not exist" containerID="2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.863214 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e"} err="failed to get container status \"2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e\": rpc error: code = NotFound desc = could not find container \"2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e\": container with ID starting with 2568abe8c7fc7ba168b0f5d6e8d76fb98844230102825e6d9668dad007f0702e not found: ID does not 
exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.863230 4991 scope.go:117] "RemoveContainer" containerID="4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.865274 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wmtfj"] Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.893913 4991 scope.go:117] "RemoveContainer" containerID="4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.894625 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f\": container with ID starting with 4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f not found: ID does not exist" containerID="4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.894696 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f"} err="failed to get container status \"4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f\": rpc error: code = NotFound desc = could not find container \"4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f\": container with ID starting with 4dd4142a6e0a5e5ee98e00853d18280493948f18798e563154f022b9d049909f not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.894747 4991 scope.go:117] "RemoveContainer" containerID="8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.908787 4991 scope.go:117] "RemoveContainer" containerID="ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae" Dec 06 01:06:51 crc kubenswrapper[4991]: 
I1206 01:06:51.913291 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.924056 4991 scope.go:117] "RemoveContainer" containerID="867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.938171 4991 scope.go:117] "RemoveContainer" containerID="8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.938493 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade\": container with ID starting with 8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade not found: ID does not exist" containerID="8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.938556 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade"} err="failed to get container status \"8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade\": rpc error: code = NotFound desc = could not find container \"8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade\": container with ID starting with 8415567781f5bd3ca845fbd841dd4b03813472e3c0a5abeb15b4f37fad780ade not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.938597 4991 scope.go:117] "RemoveContainer" containerID="ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.939114 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae\": container with ID starting with ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae not found: ID does not exist" containerID="ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.939152 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae"} err="failed to get container status \"ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae\": rpc error: code = NotFound desc = could not find container \"ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae\": container with ID starting with ca3bf8d5feedd4cccfb5b2f45a7a49a2405309013121bf2522fea69736a279ae not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.939185 4991 scope.go:117] "RemoveContainer" containerID="867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.939496 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e\": container with ID starting with 867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e not found: ID does not exist" containerID="867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.939524 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e"} err="failed to get container status \"867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e\": rpc error: code = NotFound desc = could not find container \"867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e\": container with ID 
starting with 867c41330627daa5c0d98002304ca56b3621bf8b95ffaf8d247634a23c9df42e not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.939545 4991 scope.go:117] "RemoveContainer" containerID="50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.954069 4991 scope.go:117] "RemoveContainer" containerID="3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.968313 4991 scope.go:117] "RemoveContainer" containerID="b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.980191 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.981003 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.992782 4991 scope.go:117] "RemoveContainer" containerID="50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.993314 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb\": container with ID starting with 50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb not found: ID does not exist" containerID="50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.993365 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb"} err="failed to get container status 
\"50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb\": rpc error: code = NotFound desc = could not find container \"50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb\": container with ID starting with 50c61b2e11494df050da0f7745ea53d0e2e31b47d82bf2a342b6b3dd89d06ceb not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.993402 4991 scope.go:117] "RemoveContainer" containerID="3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.993906 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf\": container with ID starting with 3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf not found: ID does not exist" containerID="3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.994020 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf"} err="failed to get container status \"3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf\": rpc error: code = NotFound desc = could not find container \"3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf\": container with ID starting with 3ba272f5c190abf1548de6c2dc71037a2f3b5ba74634deaf11a324c4a50d8dbf not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.994074 4991 scope.go:117] "RemoveContainer" containerID="b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2" Dec 06 01:06:51 crc kubenswrapper[4991]: E1206 01:06:51.994534 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2\": container with ID starting with b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2 not found: ID does not exist" containerID="b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.994570 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2"} err="failed to get container status \"b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2\": rpc error: code = NotFound desc = could not find container \"b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2\": container with ID starting with b53d9d8a3a656f17888627f62f97748bd3013161a76c2c95673aeef2eb8fbac2 not found: ID does not exist" Dec 06 01:06:51 crc kubenswrapper[4991]: I1206 01:06:51.994590 4991 scope.go:117] "RemoveContainer" containerID="076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.016122 4991 scope.go:117] "RemoveContainer" containerID="2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.025783 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.037283 4991 scope.go:117] "RemoveContainer" containerID="6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.052029 4991 scope.go:117] "RemoveContainer" containerID="076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.052591 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725\": container with ID starting with 076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725 not found: ID does not exist" containerID="076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.052643 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725"} err="failed to get container status \"076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725\": rpc error: code = NotFound desc = could not find container \"076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725\": container with ID starting with 076097247f296c0a7e90c17b745e9a8f6e92d905fc06abfd983c6b9b686e4725 not found: ID does not exist" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.052677 4991 scope.go:117] "RemoveContainer" containerID="2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.053214 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666\": container with ID starting with 2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666 not found: ID does not exist" containerID="2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.053254 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666"} err="failed to get container status \"2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666\": rpc error: code = NotFound desc = could not find container \"2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666\": container with ID 
starting with 2eb8303ad5af329db74c56ceeba5acbac0fe2e8e85e5e67628ea177274d0d666 not found: ID does not exist" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.053276 4991 scope.go:117] "RemoveContainer" containerID="6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.053759 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1\": container with ID starting with 6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1 not found: ID does not exist" containerID="6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.053797 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1"} err="failed to get container status \"6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1\": rpc error: code = NotFound desc = could not find container \"6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1\": container with ID starting with 6e525f9840cda74b087fab59022cc662c2ba4f41c037e860476319fe4257fea1 not found: ID does not exist" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.058168 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.069651 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.075213 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.105409 4991 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.173060 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.304727 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.412976 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.437151 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.474959 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.573081 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.644437 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.659450 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.698561 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.747527 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xs7vk"] Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752156 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752215 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752246 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752254 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752268 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752276 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752290 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752298 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752319 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" containerName="marketplace-operator" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752327 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" containerName="marketplace-operator" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752343 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752351 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752361 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752370 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752388 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752395 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752409 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752416 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752433 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752440 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="extract-content" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752455 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752462 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752471 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" containerName="installer" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752478 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" containerName="installer" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752494 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752502 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: E1206 01:06:52.752513 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752522 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="extract-utilities" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752857 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" containerName="marketplace-operator" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752879 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="271a982a-6005-4d08-b284-6b140ba25f07" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752900 4991 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752920 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752936 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea93134-143a-4533-a84c-d23e1f1e505c" containerName="installer" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.752962 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" containerName="registry-server" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.753783 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.758004 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.763671 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.764014 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.768765 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.776733 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.782864 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-xs7vk"] Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.801769 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxvwb\" (UniqueName: \"kubernetes.io/projected/10a1dada-c415-427c-a5ed-36eac2391451-kube-api-access-qxvwb\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.801831 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10a1dada-c415-427c-a5ed-36eac2391451-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.801896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10a1dada-c415-427c-a5ed-36eac2391451-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.903430 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxvwb\" (UniqueName: \"kubernetes.io/projected/10a1dada-c415-427c-a5ed-36eac2391451-kube-api-access-qxvwb\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.903500 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10a1dada-c415-427c-a5ed-36eac2391451-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.903548 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10a1dada-c415-427c-a5ed-36eac2391451-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.905354 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10a1dada-c415-427c-a5ed-36eac2391451-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.910191 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/10a1dada-c415-427c-a5ed-36eac2391451-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.918340 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 06 01:06:52 crc kubenswrapper[4991]: I1206 01:06:52.923230 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxvwb\" (UniqueName: 
\"kubernetes.io/projected/10a1dada-c415-427c-a5ed-36eac2391451-kube-api-access-qxvwb\") pod \"marketplace-operator-79b997595-xs7vk\" (UID: \"10a1dada-c415-427c-a5ed-36eac2391451\") " pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.028336 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.045583 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="271a982a-6005-4d08-b284-6b140ba25f07" path="/var/lib/kubelet/pods/271a982a-6005-4d08-b284-6b140ba25f07/volumes" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.046478 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad4cd41-d615-4bdd-bd75-baa9c5f2a141" path="/var/lib/kubelet/pods/3ad4cd41-d615-4bdd-bd75-baa9c5f2a141/volumes" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.047144 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c39eb27-3dbb-46d7-a874-ca6605d84134" path="/var/lib/kubelet/pods/5c39eb27-3dbb-46d7-a874-ca6605d84134/volumes" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.048701 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5f3df1-9238-407b-9fb1-97d41e3a7f11" path="/var/lib/kubelet/pods/7a5f3df1-9238-407b-9fb1-97d41e3a7f11/volumes" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.049483 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeca113e-21c6-4114-9130-111b9c2e051a" path="/var/lib/kubelet/pods/aeca113e-21c6-4114-9130-111b9c2e051a/volumes" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.081724 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.213568 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.238469 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.249548 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.260567 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.307888 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.328244 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xs7vk"] Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.401267 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.405790 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.473046 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.506769 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.682811 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.706861 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.798106 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.818312 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" event={"ID":"10a1dada-c415-427c-a5ed-36eac2391451","Type":"ContainerStarted","Data":"b764a6fca770d124d9ae6ac671cbb7479be23c8737025e1100cb6af9e61952b1"} Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.818396 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" event={"ID":"10a1dada-c415-427c-a5ed-36eac2391451","Type":"ContainerStarted","Data":"08294cd9b4b289b8b9748392f67c2f07cad8f28062adc4e6718b6c8278344633"} Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.818776 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.833583 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.838874 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xs7vk" podStartSLOduration=1.838855837 podStartE2EDuration="1.838855837s" 
podCreationTimestamp="2025-12-06 01:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:06:53.836453899 +0000 UTC m=+287.186744397" watchObservedRunningTime="2025-12-06 01:06:53.838855837 +0000 UTC m=+287.189146305" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.874823 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.900028 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.930027 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 06 01:06:53 crc kubenswrapper[4991]: I1206 01:06:53.988486 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.054676 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.100977 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.118707 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.151725 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.219440 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.235180 4991 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.592773 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.606650 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.609551 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.636893 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.846457 4991 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.848624 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.899484 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 06 01:06:54 crc kubenswrapper[4991]: I1206 01:06:54.906373 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.036285 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.171454 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.288314 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.395537 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.490921 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.654671 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.777042 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.834345 4991 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.869972 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 06 01:06:55 crc kubenswrapper[4991]: I1206 01:06:55.916229 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.023583 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.045168 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.277061 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.388197 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.575971 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.690755 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.735846 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.844934 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.870733 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 06 01:06:56 crc kubenswrapper[4991]: I1206 01:06:56.929997 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 06 01:06:57 crc kubenswrapper[4991]: I1206 01:06:57.037745 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 06 01:06:57 crc kubenswrapper[4991]: I1206 01:06:57.096367 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 06 01:06:57 crc kubenswrapper[4991]: I1206 01:06:57.098693 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 06 
01:06:57 crc kubenswrapper[4991]: I1206 01:06:57.404166 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 06 01:06:57 crc kubenswrapper[4991]: I1206 01:06:57.832205 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 06 01:06:57 crc kubenswrapper[4991]: I1206 01:06:57.978953 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 06 01:06:58 crc kubenswrapper[4991]: I1206 01:06:58.164352 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 06 01:06:58 crc kubenswrapper[4991]: I1206 01:06:58.242285 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 06 01:06:58 crc kubenswrapper[4991]: I1206 01:06:58.450361 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 06 01:06:58 crc kubenswrapper[4991]: I1206 01:06:58.507141 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 06 01:06:59 crc kubenswrapper[4991]: I1206 01:06:59.261251 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 06 01:06:59 crc kubenswrapper[4991]: I1206 01:06:59.804300 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 06 01:07:03 crc kubenswrapper[4991]: I1206 01:07:03.452229 4991 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 01:07:03 crc kubenswrapper[4991]: I1206 01:07:03.453124 4991 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://61edcb31ad91e0541ff3f0bc06fff677b8873ce12cdac2364dc0c80412f471ca" gracePeriod=5 Dec 06 01:07:08 crc kubenswrapper[4991]: I1206 01:07:08.925432 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 01:07:08 crc kubenswrapper[4991]: I1206 01:07:08.925733 4991 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="61edcb31ad91e0541ff3f0bc06fff677b8873ce12cdac2364dc0c80412f471ca" exitCode=137 Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.070363 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.070487 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.171928 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172088 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172179 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172227 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172278 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172338 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172282 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172406 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172427 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172758 4991 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172783 4991 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172802 4991 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.172821 4991 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.187571 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.273519 4991 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.936184 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.936296 4991 scope.go:117] "RemoveContainer" containerID="61edcb31ad91e0541ff3f0bc06fff677b8873ce12cdac2364dc0c80412f471ca" Dec 06 01:07:09 crc kubenswrapper[4991]: I1206 01:07:09.936425 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 06 01:07:11 crc kubenswrapper[4991]: I1206 01:07:11.045284 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 06 01:07:11 crc kubenswrapper[4991]: I1206 01:07:11.045902 4991 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 06 01:07:11 crc kubenswrapper[4991]: I1206 01:07:11.056253 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 01:07:11 crc kubenswrapper[4991]: I1206 01:07:11.056299 4991 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8f035929-3c4b-4de2-a05f-dc77fb296a35" Dec 06 01:07:11 crc kubenswrapper[4991]: I1206 01:07:11.059765 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 06 01:07:11 crc kubenswrapper[4991]: I1206 01:07:11.059807 4991 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8f035929-3c4b-4de2-a05f-dc77fb296a35" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.089305 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mw769"] Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.090531 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" podUID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" containerName="controller-manager" containerID="cri-o://e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb" gracePeriod=30 Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.207226 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"] Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.207544 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" podUID="6ef47549-ccde-4565-935c-8a92998c5eb3" containerName="route-controller-manager" containerID="cri-o://0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc" gracePeriod=30 Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.505868 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.569942 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601095 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef47549-ccde-4565-935c-8a92998c5eb3-serving-cert\") pod \"6ef47549-ccde-4565-935c-8a92998c5eb3\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601167 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd82h\" (UniqueName: \"kubernetes.io/projected/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-kube-api-access-vd82h\") pod \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601208 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-proxy-ca-bundles\") pod \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601233 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-client-ca\") pod \"6ef47549-ccde-4565-935c-8a92998c5eb3\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601271 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-config\") pod \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601333 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-config\") pod \"6ef47549-ccde-4565-935c-8a92998c5eb3\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601399 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-client-ca\") pod \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601419 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-serving-cert\") pod \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\" (UID: \"3acd3a6f-6dab-4bff-8f82-b96a8abc1555\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.601446 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbm75\" (UniqueName: \"kubernetes.io/projected/6ef47549-ccde-4565-935c-8a92998c5eb3-kube-api-access-bbm75\") pod \"6ef47549-ccde-4565-935c-8a92998c5eb3\" (UID: \"6ef47549-ccde-4565-935c-8a92998c5eb3\") " Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.602768 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-client-ca" (OuterVolumeSpecName: "client-ca") pod "3acd3a6f-6dab-4bff-8f82-b96a8abc1555" (UID: "3acd3a6f-6dab-4bff-8f82-b96a8abc1555"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.602816 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3acd3a6f-6dab-4bff-8f82-b96a8abc1555" (UID: "3acd3a6f-6dab-4bff-8f82-b96a8abc1555"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.602851 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-config" (OuterVolumeSpecName: "config") pod "3acd3a6f-6dab-4bff-8f82-b96a8abc1555" (UID: "3acd3a6f-6dab-4bff-8f82-b96a8abc1555"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.603010 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-client-ca" (OuterVolumeSpecName: "client-ca") pod "6ef47549-ccde-4565-935c-8a92998c5eb3" (UID: "6ef47549-ccde-4565-935c-8a92998c5eb3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.603042 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-config" (OuterVolumeSpecName: "config") pod "6ef47549-ccde-4565-935c-8a92998c5eb3" (UID: "6ef47549-ccde-4565-935c-8a92998c5eb3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.609810 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef47549-ccde-4565-935c-8a92998c5eb3-kube-api-access-bbm75" (OuterVolumeSpecName: "kube-api-access-bbm75") pod "6ef47549-ccde-4565-935c-8a92998c5eb3" (UID: "6ef47549-ccde-4565-935c-8a92998c5eb3"). InnerVolumeSpecName "kube-api-access-bbm75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.610677 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3acd3a6f-6dab-4bff-8f82-b96a8abc1555" (UID: "3acd3a6f-6dab-4bff-8f82-b96a8abc1555"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.618135 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef47549-ccde-4565-935c-8a92998c5eb3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6ef47549-ccde-4565-935c-8a92998c5eb3" (UID: "6ef47549-ccde-4565-935c-8a92998c5eb3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.619847 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-kube-api-access-vd82h" (OuterVolumeSpecName: "kube-api-access-vd82h") pod "3acd3a6f-6dab-4bff-8f82-b96a8abc1555" (UID: "3acd3a6f-6dab-4bff-8f82-b96a8abc1555"). InnerVolumeSpecName "kube-api-access-vd82h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703653 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703719 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703730 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703743 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703754 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbm75\" (UniqueName: \"kubernetes.io/projected/6ef47549-ccde-4565-935c-8a92998c5eb3-kube-api-access-bbm75\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703768 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef47549-ccde-4565-935c-8a92998c5eb3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703778 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd82h\" (UniqueName: \"kubernetes.io/projected/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-kube-api-access-vd82h\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703791 4991 reconciler_common.go:293] 
"Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3acd3a6f-6dab-4bff-8f82-b96a8abc1555-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:23 crc kubenswrapper[4991]: I1206 01:07:23.703810 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef47549-ccde-4565-935c-8a92998c5eb3-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.034927 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ef47549-ccde-4565-935c-8a92998c5eb3" containerID="0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc" exitCode=0 Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.034996 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" event={"ID":"6ef47549-ccde-4565-935c-8a92998c5eb3","Type":"ContainerDied","Data":"0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc"} Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.035345 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" event={"ID":"6ef47549-ccde-4565-935c-8a92998c5eb3","Type":"ContainerDied","Data":"55eb1c88b0be911f701d88c8ed61ffb30352ccb0fff37e3de488a1e8822ab894"} Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.035368 4991 scope.go:117] "RemoveContainer" containerID="0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.035007 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.037384 4991 generic.go:334] "Generic (PLEG): container finished" podID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" containerID="e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb" exitCode=0 Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.037411 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" event={"ID":"3acd3a6f-6dab-4bff-8f82-b96a8abc1555","Type":"ContainerDied","Data":"e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb"} Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.037427 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.037436 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mw769" event={"ID":"3acd3a6f-6dab-4bff-8f82-b96a8abc1555","Type":"ContainerDied","Data":"6213860d80a4a1853515dff639c19fe1976a0094936fa472122680656692a3cd"} Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.057068 4991 scope.go:117] "RemoveContainer" containerID="0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc" Dec 06 01:07:24 crc kubenswrapper[4991]: E1206 01:07:24.057546 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc\": container with ID starting with 0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc not found: ID does not exist" containerID="0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.057595 4991 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc"} err="failed to get container status \"0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc\": rpc error: code = NotFound desc = could not find container \"0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc\": container with ID starting with 0121e0344eecde6b772d3c8043ed674d1c1545009fc52db676ef55668f45abdc not found: ID does not exist" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.057628 4991 scope.go:117] "RemoveContainer" containerID="e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.080645 4991 scope.go:117] "RemoveContainer" containerID="e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.080950 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"] Dec 06 01:07:24 crc kubenswrapper[4991]: E1206 01:07:24.081736 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb\": container with ID starting with e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb not found: ID does not exist" containerID="e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.081778 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb"} err="failed to get container status \"e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb\": rpc error: code = NotFound desc = could not find container \"e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb\": container 
with ID starting with e47c7d5cc241809a44f7ec9e7d0a97f6033840782844d9a11ea835f8dbe5c6cb not found: ID does not exist" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.089275 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nhs9f"] Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.094153 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mw769"] Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.096843 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mw769"] Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.430676 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf"] Dec 06 01:07:24 crc kubenswrapper[4991]: E1206 01:07:24.431052 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef47549-ccde-4565-935c-8a92998c5eb3" containerName="route-controller-manager" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431075 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef47549-ccde-4565-935c-8a92998c5eb3" containerName="route-controller-manager" Dec 06 01:07:24 crc kubenswrapper[4991]: E1206 01:07:24.431095 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431108 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 01:07:24 crc kubenswrapper[4991]: E1206 01:07:24.431138 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" containerName="controller-manager" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431151 4991 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" containerName="controller-manager" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431311 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef47549-ccde-4565-935c-8a92998c5eb3" containerName="route-controller-manager" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431342 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" containerName="controller-manager" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431363 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.431960 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.434574 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.435339 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.435890 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.436145 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.436540 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.437149 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.445537 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.447067 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf"] Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.516823 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-client-ca\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.516930 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-config\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.517039 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-proxy-ca-bundles\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.517086 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28mh\" (UniqueName: 
\"kubernetes.io/projected/cc016a33-50b7-4e93-991a-102e4cc5a630-kube-api-access-l28mh\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.517121 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc016a33-50b7-4e93-991a-102e4cc5a630-serving-cert\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.619200 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-client-ca\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.619587 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-config\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.619701 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-proxy-ca-bundles\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.619794 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28mh\" (UniqueName: \"kubernetes.io/projected/cc016a33-50b7-4e93-991a-102e4cc5a630-kube-api-access-l28mh\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.619892 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc016a33-50b7-4e93-991a-102e4cc5a630-serving-cert\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.621925 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-proxy-ca-bundles\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.622587 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-config\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.622658 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-client-ca\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " 
pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.633100 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc016a33-50b7-4e93-991a-102e4cc5a630-serving-cert\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.660837 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28mh\" (UniqueName: \"kubernetes.io/projected/cc016a33-50b7-4e93-991a-102e4cc5a630-kube-api-access-l28mh\") pod \"controller-manager-866ddd4d9c-mkmxf\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:24 crc kubenswrapper[4991]: I1206 01:07:24.806593 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.046257 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acd3a6f-6dab-4bff-8f82-b96a8abc1555" path="/var/lib/kubelet/pods/3acd3a6f-6dab-4bff-8f82-b96a8abc1555/volumes" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.047311 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef47549-ccde-4565-935c-8a92998c5eb3" path="/var/lib/kubelet/pods/6ef47549-ccde-4565-935c-8a92998c5eb3/volumes" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.072715 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf"] Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.430361 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr"] Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.431238 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.434122 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.435048 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.438734 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.438917 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.439525 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.440166 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.466009 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr"] Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.532326 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkm6k\" (UniqueName: \"kubernetes.io/projected/a8276f0d-1980-482f-bddb-7c84c2723e03-kube-api-access-mkm6k\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.532391 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-config\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.532427 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-client-ca\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.532459 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8276f0d-1980-482f-bddb-7c84c2723e03-serving-cert\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.633754 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkm6k\" (UniqueName: \"kubernetes.io/projected/a8276f0d-1980-482f-bddb-7c84c2723e03-kube-api-access-mkm6k\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.634129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-config\") pod 
\"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.634173 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-client-ca\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.634211 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8276f0d-1980-482f-bddb-7c84c2723e03-serving-cert\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.636223 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-config\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.636893 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-client-ca\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.640725 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8276f0d-1980-482f-bddb-7c84c2723e03-serving-cert\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.676318 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkm6k\" (UniqueName: \"kubernetes.io/projected/a8276f0d-1980-482f-bddb-7c84c2723e03-kube-api-access-mkm6k\") pod \"route-controller-manager-8b979b485-tgfbr\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:25 crc kubenswrapper[4991]: I1206 01:07:25.750578 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.001868 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr"] Dec 06 01:07:26 crc kubenswrapper[4991]: W1206 01:07:26.010295 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8276f0d_1980_482f_bddb_7c84c2723e03.slice/crio-54293d7364f5aa39d9583142fe56d31ae773149ef3d4b851f27dfe207a7264b3 WatchSource:0}: Error finding container 54293d7364f5aa39d9583142fe56d31ae773149ef3d4b851f27dfe207a7264b3: Status 404 returned error can't find the container with id 54293d7364f5aa39d9583142fe56d31ae773149ef3d4b851f27dfe207a7264b3 Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.063879 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" 
event={"ID":"a8276f0d-1980-482f-bddb-7c84c2723e03","Type":"ContainerStarted","Data":"54293d7364f5aa39d9583142fe56d31ae773149ef3d4b851f27dfe207a7264b3"} Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.065911 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" event={"ID":"cc016a33-50b7-4e93-991a-102e4cc5a630","Type":"ContainerStarted","Data":"3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0"} Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.065941 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" event={"ID":"cc016a33-50b7-4e93-991a-102e4cc5a630","Type":"ContainerStarted","Data":"dfa478083fc0c1634fd60c1f9f280513452ff994fb0942a8fa11db0e9ea6c548"} Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.066724 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.074949 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:26 crc kubenswrapper[4991]: I1206 01:07:26.084931 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" podStartSLOduration=3.084908028 podStartE2EDuration="3.084908028s" podCreationTimestamp="2025-12-06 01:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:07:26.084295133 +0000 UTC m=+319.434585611" watchObservedRunningTime="2025-12-06 01:07:26.084908028 +0000 UTC m=+319.435198496" Dec 06 01:07:27 crc kubenswrapper[4991]: I1206 01:07:27.075862 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" event={"ID":"a8276f0d-1980-482f-bddb-7c84c2723e03","Type":"ContainerStarted","Data":"c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02"} Dec 06 01:07:27 crc kubenswrapper[4991]: I1206 01:07:27.098451 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" podStartSLOduration=4.098417802 podStartE2EDuration="4.098417802s" podCreationTimestamp="2025-12-06 01:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:07:27.096816493 +0000 UTC m=+320.447106991" watchObservedRunningTime="2025-12-06 01:07:27.098417802 +0000 UTC m=+320.448708280" Dec 06 01:07:28 crc kubenswrapper[4991]: I1206 01:07:28.082562 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:28 crc kubenswrapper[4991]: I1206 01:07:28.091655 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.659881 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mcvg9"] Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.661836 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.687015 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mcvg9"] Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850600 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c9d5871-1739-4375-a96e-b8ec6c552e2b-registry-certificates\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850651 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c9d5871-1739-4375-a96e-b8ec6c552e2b-trusted-ca\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850683 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-bound-sa-token\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850703 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c9d5871-1739-4375-a96e-b8ec6c552e2b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850733 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjs8l\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-kube-api-access-fjs8l\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850764 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850793 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-registry-tls\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.850816 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c9d5871-1739-4375-a96e-b8ec6c552e2b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.891048 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952057 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c9d5871-1739-4375-a96e-b8ec6c552e2b-registry-certificates\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c9d5871-1739-4375-a96e-b8ec6c552e2b-trusted-ca\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952154 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-bound-sa-token\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952174 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c9d5871-1739-4375-a96e-b8ec6c552e2b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc 
kubenswrapper[4991]: I1206 01:07:32.952206 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjs8l\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-kube-api-access-fjs8l\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952256 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-registry-tls\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952271 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c9d5871-1739-4375-a96e-b8ec6c552e2b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.952972 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c9d5871-1739-4375-a96e-b8ec6c552e2b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.954049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c9d5871-1739-4375-a96e-b8ec6c552e2b-trusted-ca\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.954143 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c9d5871-1739-4375-a96e-b8ec6c552e2b-registry-certificates\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.961645 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c9d5871-1739-4375-a96e-b8ec6c552e2b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.961681 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-registry-tls\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.974038 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjs8l\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-kube-api-access-fjs8l\") pod \"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.976817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c9d5871-1739-4375-a96e-b8ec6c552e2b-bound-sa-token\") pod 
\"image-registry-66df7c8f76-mcvg9\" (UID: \"7c9d5871-1739-4375-a96e-b8ec6c552e2b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:32 crc kubenswrapper[4991]: I1206 01:07:32.989909 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:33 crc kubenswrapper[4991]: I1206 01:07:33.500349 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mcvg9"] Dec 06 01:07:34 crc kubenswrapper[4991]: I1206 01:07:34.121832 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" event={"ID":"7c9d5871-1739-4375-a96e-b8ec6c552e2b","Type":"ContainerStarted","Data":"22a140734e101f21957cd1c45aebf234ec3ae710230b34a2afe9e0948f3b6f9c"} Dec 06 01:07:34 crc kubenswrapper[4991]: I1206 01:07:34.122380 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:34 crc kubenswrapper[4991]: I1206 01:07:34.122398 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" event={"ID":"7c9d5871-1739-4375-a96e-b8ec6c552e2b","Type":"ContainerStarted","Data":"6c9662d1deebc0cec9356716de4c89532a9789d451580aabdbb7419c319c662f"} Dec 06 01:07:34 crc kubenswrapper[4991]: I1206 01:07:34.144557 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" podStartSLOduration=2.14453528 podStartE2EDuration="2.14453528s" podCreationTimestamp="2025-12-06 01:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:07:34.144155929 +0000 UTC m=+327.494446397" watchObservedRunningTime="2025-12-06 01:07:34.14453528 +0000 UTC m=+327.494825748" Dec 06 01:07:40 crc 
kubenswrapper[4991]: I1206 01:07:40.354505 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wb2rc"] Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.357156 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.360292 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.379220 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wb2rc"] Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.497185 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-catalog-content\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.497497 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzcpn\" (UniqueName: \"kubernetes.io/projected/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-kube-api-access-qzcpn\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.497657 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-utilities\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.599928 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzcpn\" (UniqueName: \"kubernetes.io/projected/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-kube-api-access-qzcpn\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.600066 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-utilities\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.600149 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-catalog-content\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.601113 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-catalog-content\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.601258 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-utilities\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.630221 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qzcpn\" (UniqueName: \"kubernetes.io/projected/aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc-kube-api-access-qzcpn\") pod \"redhat-operators-wb2rc\" (UID: \"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc\") " pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.708934 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.744209 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-chxpn"] Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.746129 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.750428 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.774477 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chxpn"] Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.910425 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s4xn\" (UniqueName: \"kubernetes.io/projected/a9de55b2-75a9-4843-bfee-297b2522e7f0-kube-api-access-8s4xn\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.910485 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9de55b2-75a9-4843-bfee-297b2522e7f0-utilities\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " 
pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:40 crc kubenswrapper[4991]: I1206 01:07:40.910525 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9de55b2-75a9-4843-bfee-297b2522e7f0-catalog-content\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.012887 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9de55b2-75a9-4843-bfee-297b2522e7f0-catalog-content\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.013139 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s4xn\" (UniqueName: \"kubernetes.io/projected/a9de55b2-75a9-4843-bfee-297b2522e7f0-kube-api-access-8s4xn\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.013205 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9de55b2-75a9-4843-bfee-297b2522e7f0-utilities\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.014343 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9de55b2-75a9-4843-bfee-297b2522e7f0-utilities\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " 
pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.015323 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9de55b2-75a9-4843-bfee-297b2522e7f0-catalog-content\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.037571 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s4xn\" (UniqueName: \"kubernetes.io/projected/a9de55b2-75a9-4843-bfee-297b2522e7f0-kube-api-access-8s4xn\") pod \"redhat-marketplace-chxpn\" (UID: \"a9de55b2-75a9-4843-bfee-297b2522e7f0\") " pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.132570 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.189839 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wb2rc"] Dec 06 01:07:41 crc kubenswrapper[4991]: W1206 01:07:41.205361 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0283c0_7c60_47d5_9bf8_28bf2c1d0ecc.slice/crio-e946a4f6c8b769a24af0bf533fd2daef5531e26015dd25c9e3c2a0c35591d196 WatchSource:0}: Error finding container e946a4f6c8b769a24af0bf533fd2daef5531e26015dd25c9e3c2a0c35591d196: Status 404 returned error can't find the container with id e946a4f6c8b769a24af0bf533fd2daef5531e26015dd25c9e3c2a0c35591d196 Dec 06 01:07:41 crc kubenswrapper[4991]: I1206 01:07:41.588956 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chxpn"] Dec 06 01:07:41 crc kubenswrapper[4991]: W1206 01:07:41.595106 4991 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9de55b2_75a9_4843_bfee_297b2522e7f0.slice/crio-f32573e07d0abd4e497f6b8a74a4c2747e112d2ebfdef7e747658c17a75c90b1 WatchSource:0}: Error finding container f32573e07d0abd4e497f6b8a74a4c2747e112d2ebfdef7e747658c17a75c90b1: Status 404 returned error can't find the container with id f32573e07d0abd4e497f6b8a74a4c2747e112d2ebfdef7e747658c17a75c90b1 Dec 06 01:07:42 crc kubenswrapper[4991]: I1206 01:07:42.183121 4991 generic.go:334] "Generic (PLEG): container finished" podID="a9de55b2-75a9-4843-bfee-297b2522e7f0" containerID="23d900353650a2158ec43ab5c2ba97bb814b68c3f5b7c3b116121ea54b0a0692" exitCode=0 Dec 06 01:07:42 crc kubenswrapper[4991]: I1206 01:07:42.183216 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chxpn" event={"ID":"a9de55b2-75a9-4843-bfee-297b2522e7f0","Type":"ContainerDied","Data":"23d900353650a2158ec43ab5c2ba97bb814b68c3f5b7c3b116121ea54b0a0692"} Dec 06 01:07:42 crc kubenswrapper[4991]: I1206 01:07:42.183621 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chxpn" event={"ID":"a9de55b2-75a9-4843-bfee-297b2522e7f0","Type":"ContainerStarted","Data":"f32573e07d0abd4e497f6b8a74a4c2747e112d2ebfdef7e747658c17a75c90b1"} Dec 06 01:07:42 crc kubenswrapper[4991]: I1206 01:07:42.186213 4991 generic.go:334] "Generic (PLEG): container finished" podID="aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc" containerID="3e2f2baa12917fbd57feb01635bb667d2335f661f42c0bc9b99847c8606077f8" exitCode=0 Dec 06 01:07:42 crc kubenswrapper[4991]: I1206 01:07:42.186270 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wb2rc" event={"ID":"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc","Type":"ContainerDied","Data":"3e2f2baa12917fbd57feb01635bb667d2335f661f42c0bc9b99847c8606077f8"} Dec 06 01:07:42 crc kubenswrapper[4991]: I1206 01:07:42.186308 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wb2rc" event={"ID":"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc","Type":"ContainerStarted","Data":"e946a4f6c8b769a24af0bf533fd2daef5531e26015dd25c9e3c2a0c35591d196"} Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.141128 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9mn9"] Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.143129 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.145481 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.149596 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf"] Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.149855 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" podUID="cc016a33-50b7-4e93-991a-102e4cc5a630" containerName="controller-manager" containerID="cri-o://3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0" gracePeriod=30 Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.179471 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr"] Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.179741 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" podUID="a8276f0d-1980-482f-bddb-7c84c2723e03" containerName="route-controller-manager" containerID="cri-o://c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02" gracePeriod=30 
Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.239807 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chxpn" event={"ID":"a9de55b2-75a9-4843-bfee-297b2522e7f0","Type":"ContainerStarted","Data":"eaf01866e623c2ad84cdf0dfc6630455499a00ca9f2e7036fb4c38975a4ff7f9"} Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.255411 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d40092-c83e-4410-9535-73e4aad9eafc-utilities\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.255470 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvtp\" (UniqueName: \"kubernetes.io/projected/97d40092-c83e-4410-9535-73e4aad9eafc-kube-api-access-hlvtp\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.255492 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d40092-c83e-4410-9535-73e4aad9eafc-catalog-content\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.288253 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9mn9"] Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.355470 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v9ncc"] Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.356652 4991 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.356928 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d40092-c83e-4410-9535-73e4aad9eafc-utilities\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.357008 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvtp\" (UniqueName: \"kubernetes.io/projected/97d40092-c83e-4410-9535-73e4aad9eafc-kube-api-access-hlvtp\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.357038 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d40092-c83e-4410-9535-73e4aad9eafc-catalog-content\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.357575 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d40092-c83e-4410-9535-73e4aad9eafc-catalog-content\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.357731 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d40092-c83e-4410-9535-73e4aad9eafc-utilities\") pod \"certified-operators-w9mn9\" (UID: 
\"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.363228 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.375317 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9ncc"] Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.401058 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvtp\" (UniqueName: \"kubernetes.io/projected/97d40092-c83e-4410-9535-73e4aad9eafc-kube-api-access-hlvtp\") pod \"certified-operators-w9mn9\" (UID: \"97d40092-c83e-4410-9535-73e4aad9eafc\") " pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.458046 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-catalog-content\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.458164 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvq5\" (UniqueName: \"kubernetes.io/projected/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-kube-api-access-gzvq5\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.458184 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-utilities\") pod 
\"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.467381 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.560183 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-catalog-content\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.560629 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvq5\" (UniqueName: \"kubernetes.io/projected/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-kube-api-access-gzvq5\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.560651 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-utilities\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.560898 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-catalog-content\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.561152 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-utilities\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.581959 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvq5\" (UniqueName: \"kubernetes.io/projected/d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1-kube-api-access-gzvq5\") pod \"community-operators-v9ncc\" (UID: \"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1\") " pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.726751 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9mn9"] Dec 06 01:07:43 crc kubenswrapper[4991]: W1206 01:07:43.733376 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d40092_c83e_4410_9535_73e4aad9eafc.slice/crio-1e17ff2931935b75fee5b841e2411d8293f43a9edd09b67167d48e993185c5c6 WatchSource:0}: Error finding container 1e17ff2931935b75fee5b841e2411d8293f43a9edd09b67167d48e993185c5c6: Status 404 returned error can't find the container with id 1e17ff2931935b75fee5b841e2411d8293f43a9edd09b67167d48e993185c5c6 Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.739751 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.745958 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.764112 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-client-ca\") pod \"a8276f0d-1980-482f-bddb-7c84c2723e03\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.764162 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkm6k\" (UniqueName: \"kubernetes.io/projected/a8276f0d-1980-482f-bddb-7c84c2723e03-kube-api-access-mkm6k\") pod \"a8276f0d-1980-482f-bddb-7c84c2723e03\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.764194 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-config\") pod \"a8276f0d-1980-482f-bddb-7c84c2723e03\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.764216 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8276f0d-1980-482f-bddb-7c84c2723e03-serving-cert\") pod \"a8276f0d-1980-482f-bddb-7c84c2723e03\" (UID: \"a8276f0d-1980-482f-bddb-7c84c2723e03\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.765737 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-config" (OuterVolumeSpecName: "config") pod "a8276f0d-1980-482f-bddb-7c84c2723e03" (UID: "a8276f0d-1980-482f-bddb-7c84c2723e03"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.766711 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8276f0d-1980-482f-bddb-7c84c2723e03" (UID: "a8276f0d-1980-482f-bddb-7c84c2723e03"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.768168 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8276f0d-1980-482f-bddb-7c84c2723e03-kube-api-access-mkm6k" (OuterVolumeSpecName: "kube-api-access-mkm6k") pod "a8276f0d-1980-482f-bddb-7c84c2723e03" (UID: "a8276f0d-1980-482f-bddb-7c84c2723e03"). InnerVolumeSpecName "kube-api-access-mkm6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.773696 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8276f0d-1980-482f-bddb-7c84c2723e03-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8276f0d-1980-482f-bddb-7c84c2723e03" (UID: "a8276f0d-1980-482f-bddb-7c84c2723e03"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.833501 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.872337 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-config\") pod \"cc016a33-50b7-4e93-991a-102e4cc5a630\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.872880 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-client-ca\") pod \"cc016a33-50b7-4e93-991a-102e4cc5a630\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.872939 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l28mh\" (UniqueName: \"kubernetes.io/projected/cc016a33-50b7-4e93-991a-102e4cc5a630-kube-api-access-l28mh\") pod \"cc016a33-50b7-4e93-991a-102e4cc5a630\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.872999 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-proxy-ca-bundles\") pod \"cc016a33-50b7-4e93-991a-102e4cc5a630\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.873061 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc016a33-50b7-4e93-991a-102e4cc5a630-serving-cert\") pod \"cc016a33-50b7-4e93-991a-102e4cc5a630\" (UID: \"cc016a33-50b7-4e93-991a-102e4cc5a630\") " Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.874851 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-client-ca" (OuterVolumeSpecName: "client-ca") pod "cc016a33-50b7-4e93-991a-102e4cc5a630" (UID: "cc016a33-50b7-4e93-991a-102e4cc5a630"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875271 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cc016a33-50b7-4e93-991a-102e4cc5a630" (UID: "cc016a33-50b7-4e93-991a-102e4cc5a630"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875471 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875485 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8276f0d-1980-482f-bddb-7c84c2723e03-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875495 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875504 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875514 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a8276f0d-1980-482f-bddb-7c84c2723e03-client-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.875531 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkm6k\" (UniqueName: \"kubernetes.io/projected/a8276f0d-1980-482f-bddb-7c84c2723e03-kube-api-access-mkm6k\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.877247 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc016a33-50b7-4e93-991a-102e4cc5a630-kube-api-access-l28mh" (OuterVolumeSpecName: "kube-api-access-l28mh") pod "cc016a33-50b7-4e93-991a-102e4cc5a630" (UID: "cc016a33-50b7-4e93-991a-102e4cc5a630"). InnerVolumeSpecName "kube-api-access-l28mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.878219 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc016a33-50b7-4e93-991a-102e4cc5a630-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cc016a33-50b7-4e93-991a-102e4cc5a630" (UID: "cc016a33-50b7-4e93-991a-102e4cc5a630"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.878859 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-config" (OuterVolumeSpecName: "config") pod "cc016a33-50b7-4e93-991a-102e4cc5a630" (UID: "cc016a33-50b7-4e93-991a-102e4cc5a630"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.975919 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc016a33-50b7-4e93-991a-102e4cc5a630-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.975963 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l28mh\" (UniqueName: \"kubernetes.io/projected/cc016a33-50b7-4e93-991a-102e4cc5a630-kube-api-access-l28mh\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:43 crc kubenswrapper[4991]: I1206 01:07:43.975976 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc016a33-50b7-4e93-991a-102e4cc5a630-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.203587 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9ncc"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.248484 4991 generic.go:334] "Generic (PLEG): container finished" podID="a8276f0d-1980-482f-bddb-7c84c2723e03" containerID="c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02" exitCode=0 Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.248556 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" event={"ID":"a8276f0d-1980-482f-bddb-7c84c2723e03","Type":"ContainerDied","Data":"c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.248609 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.248640 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr" event={"ID":"a8276f0d-1980-482f-bddb-7c84c2723e03","Type":"ContainerDied","Data":"54293d7364f5aa39d9583142fe56d31ae773149ef3d4b851f27dfe207a7264b3"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.248682 4991 scope.go:117] "RemoveContainer" containerID="c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.253936 4991 generic.go:334] "Generic (PLEG): container finished" podID="cc016a33-50b7-4e93-991a-102e4cc5a630" containerID="3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0" exitCode=0 Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.254023 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" event={"ID":"cc016a33-50b7-4e93-991a-102e4cc5a630","Type":"ContainerDied","Data":"3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.254047 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" event={"ID":"cc016a33-50b7-4e93-991a-102e4cc5a630","Type":"ContainerDied","Data":"dfa478083fc0c1634fd60c1f9f280513452ff994fb0942a8fa11db0e9ea6c548"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.254129 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.266686 4991 generic.go:334] "Generic (PLEG): container finished" podID="97d40092-c83e-4410-9535-73e4aad9eafc" containerID="ed07a27b1f76dbc673f059d485475c6f0503e8fb66b24c859520afd3f59c9dc6" exitCode=0 Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.266796 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9mn9" event={"ID":"97d40092-c83e-4410-9535-73e4aad9eafc","Type":"ContainerDied","Data":"ed07a27b1f76dbc673f059d485475c6f0503e8fb66b24c859520afd3f59c9dc6"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.266840 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9mn9" event={"ID":"97d40092-c83e-4410-9535-73e4aad9eafc","Type":"ContainerStarted","Data":"1e17ff2931935b75fee5b841e2411d8293f43a9edd09b67167d48e993185c5c6"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.270746 4991 scope.go:117] "RemoveContainer" containerID="c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02" Dec 06 01:07:44 crc kubenswrapper[4991]: E1206 01:07:44.271324 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02\": container with ID starting with c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02 not found: ID does not exist" containerID="c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.271371 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02"} err="failed to get container status \"c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02\": rpc error: code = 
NotFound desc = could not find container \"c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02\": container with ID starting with c3f021e0ef965f5c40a101d81619c63dcb86e6dad745213acc68c128cc22de02 not found: ID does not exist" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.271404 4991 scope.go:117] "RemoveContainer" containerID="3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.273324 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9ncc" event={"ID":"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1","Type":"ContainerStarted","Data":"e217d651c434e0581ef6f96d1ddd7f8e602208fe6335540f0ccd8ce4354e6688"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.281347 4991 generic.go:334] "Generic (PLEG): container finished" podID="a9de55b2-75a9-4843-bfee-297b2522e7f0" containerID="eaf01866e623c2ad84cdf0dfc6630455499a00ca9f2e7036fb4c38975a4ff7f9" exitCode=0 Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.282009 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chxpn" event={"ID":"a9de55b2-75a9-4843-bfee-297b2522e7f0","Type":"ContainerDied","Data":"eaf01866e623c2ad84cdf0dfc6630455499a00ca9f2e7036fb4c38975a4ff7f9"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.286496 4991 generic.go:334] "Generic (PLEG): container finished" podID="aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc" containerID="57bec8ee9526a66faeb65c555795ace12c48e7bb649709d47997a0d007bb8edb" exitCode=0 Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.286534 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wb2rc" event={"ID":"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc","Type":"ContainerDied","Data":"57bec8ee9526a66faeb65c555795ace12c48e7bb649709d47997a0d007bb8edb"} Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.321317 4991 scope.go:117] "RemoveContainer" 
containerID="3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0" Dec 06 01:07:44 crc kubenswrapper[4991]: E1206 01:07:44.324049 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0\": container with ID starting with 3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0 not found: ID does not exist" containerID="3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.324101 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0"} err="failed to get container status \"3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0\": rpc error: code = NotFound desc = could not find container \"3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0\": container with ID starting with 3b98e2ee8ecfe91e4d8a2a43ab8193cb756547ccd54cabba9abc5e553c1a91b0 not found: ID does not exist" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.333201 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.337851 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-tgfbr"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.371300 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.376351 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-mkmxf"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.447822 4991 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85c8d48899-x89xc"] Dec 06 01:07:44 crc kubenswrapper[4991]: E1206 01:07:44.448211 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8276f0d-1980-482f-bddb-7c84c2723e03" containerName="route-controller-manager" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.448236 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8276f0d-1980-482f-bddb-7c84c2723e03" containerName="route-controller-manager" Dec 06 01:07:44 crc kubenswrapper[4991]: E1206 01:07:44.448263 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc016a33-50b7-4e93-991a-102e4cc5a630" containerName="controller-manager" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.448271 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc016a33-50b7-4e93-991a-102e4cc5a630" containerName="controller-manager" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.448388 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc016a33-50b7-4e93-991a-102e4cc5a630" containerName="controller-manager" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.448413 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8276f0d-1980-482f-bddb-7c84c2723e03" containerName="route-controller-manager" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.449037 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.454541 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.454628 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.459655 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.460014 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.460223 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.460443 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.464318 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.464972 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85c8d48899-x89xc"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.465009 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4"] Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.465113 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.465349 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.468424 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.468640 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.468752 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.468869 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.469014 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.469208 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.584444 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-client-ca\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.584636 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-serving-cert\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.584750 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-config\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.584803 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5fk9\" (UniqueName: \"kubernetes.io/projected/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-kube-api-access-g5fk9\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.584841 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-config\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.585046 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-client-ca\") pod 
\"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.585107 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65f7916-b0cc-4636-8569-7d6236524909-serving-cert\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.585139 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-proxy-ca-bundles\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.585173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdnx\" (UniqueName: \"kubernetes.io/projected/e65f7916-b0cc-4636-8569-7d6236524909-kube-api-access-jbdnx\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.688263 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-serving-cert\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 
01:07:44.688346 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-client-ca\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.688410 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-config\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.688451 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5fk9\" (UniqueName: \"kubernetes.io/projected/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-kube-api-access-g5fk9\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.688505 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-config\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.689540 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-client-ca\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " 
pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.689609 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65f7916-b0cc-4636-8569-7d6236524909-serving-cert\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.689652 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-proxy-ca-bundles\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.689706 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdnx\" (UniqueName: \"kubernetes.io/projected/e65f7916-b0cc-4636-8569-7d6236524909-kube-api-access-jbdnx\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.689914 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-config\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.690033 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-client-ca\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.690357 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-config\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.690906 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e65f7916-b0cc-4636-8569-7d6236524909-proxy-ca-bundles\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.691411 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-client-ca\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.698812 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e65f7916-b0cc-4636-8569-7d6236524909-serving-cert\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.706103 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-serving-cert\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.720639 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5fk9\" (UniqueName: \"kubernetes.io/projected/f8ada602-1af2-47e9-b5ad-0f2fa8805d95-kube-api-access-g5fk9\") pod \"route-controller-manager-b8595ddbb-dkxx4\" (UID: \"f8ada602-1af2-47e9-b5ad-0f2fa8805d95\") " pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.723650 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdnx\" (UniqueName: \"kubernetes.io/projected/e65f7916-b0cc-4636-8569-7d6236524909-kube-api-access-jbdnx\") pod \"controller-manager-85c8d48899-x89xc\" (UID: \"e65f7916-b0cc-4636-8569-7d6236524909\") " pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.826433 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:44 crc kubenswrapper[4991]: I1206 01:07:44.834955 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.046455 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8276f0d-1980-482f-bddb-7c84c2723e03" path="/var/lib/kubelet/pods/a8276f0d-1980-482f-bddb-7c84c2723e03/volumes" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.047464 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc016a33-50b7-4e93-991a-102e4cc5a630" path="/var/lib/kubelet/pods/cc016a33-50b7-4e93-991a-102e4cc5a630/volumes" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.067697 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4"] Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.157382 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85c8d48899-x89xc"] Dec 06 01:07:45 crc kubenswrapper[4991]: W1206 01:07:45.167605 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode65f7916_b0cc_4636_8569_7d6236524909.slice/crio-8613d866104b1ca10d42c0e0e8a53efbc776b1ca99018ead993e719095ad109f WatchSource:0}: Error finding container 8613d866104b1ca10d42c0e0e8a53efbc776b1ca99018ead993e719095ad109f: Status 404 returned error can't find the container with id 8613d866104b1ca10d42c0e0e8a53efbc776b1ca99018ead993e719095ad109f Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.292754 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chxpn" event={"ID":"a9de55b2-75a9-4843-bfee-297b2522e7f0","Type":"ContainerStarted","Data":"4e3f30df78e7f01db76c6fa2afbf46905d7ef99f6842d8a3c9b12b5736cbae88"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.298766 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wb2rc" event={"ID":"aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc","Type":"ContainerStarted","Data":"d610944289abca68d4208a6cb0a07da884db8c73618d357365c27d8e1e7a847f"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.303009 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" event={"ID":"f8ada602-1af2-47e9-b5ad-0f2fa8805d95","Type":"ContainerStarted","Data":"14ea08db21410cae571cc73d2b57f2a1c62bb9acf4cec27d909e726df59bff32"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.303054 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" event={"ID":"f8ada602-1af2-47e9-b5ad-0f2fa8805d95","Type":"ContainerStarted","Data":"02ca896f9f86e31f525282d67abf673a685dc4867071b960cca016706167e744"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.303942 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.305225 4991 generic.go:334] "Generic (PLEG): container finished" podID="d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1" containerID="b62a939a91418cfb4ae1c8fc2bfedda516e92dc64ad67d6b412609180071f8bd" exitCode=0 Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.305283 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9ncc" event={"ID":"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1","Type":"ContainerDied","Data":"b62a939a91418cfb4ae1c8fc2bfedda516e92dc64ad67d6b412609180071f8bd"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.305859 4991 patch_prober.go:28] interesting pod/route-controller-manager-b8595ddbb-dkxx4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.305893 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" podUID="f8ada602-1af2-47e9-b5ad-0f2fa8805d95" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.309015 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" event={"ID":"e65f7916-b0cc-4636-8569-7d6236524909","Type":"ContainerStarted","Data":"f8be74106c84d8b56107c27043b4e18157cddde710e7ed0a44ae68a4a760f5f6"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.309049 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" event={"ID":"e65f7916-b0cc-4636-8569-7d6236524909","Type":"ContainerStarted","Data":"8613d866104b1ca10d42c0e0e8a53efbc776b1ca99018ead993e719095ad109f"} Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.309574 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.311756 4991 patch_prober.go:28] interesting pod/controller-manager-85c8d48899-x89xc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.311814 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" 
podUID="e65f7916-b0cc-4636-8569-7d6236524909" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.316555 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-chxpn" podStartSLOduration=2.753343012 podStartE2EDuration="5.316531452s" podCreationTimestamp="2025-12-06 01:07:40 +0000 UTC" firstStartedPulling="2025-12-06 01:07:42.185290156 +0000 UTC m=+335.535580664" lastFinishedPulling="2025-12-06 01:07:44.748478626 +0000 UTC m=+338.098769104" observedRunningTime="2025-12-06 01:07:45.314611007 +0000 UTC m=+338.664901475" watchObservedRunningTime="2025-12-06 01:07:45.316531452 +0000 UTC m=+338.666821920" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.339066 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" podStartSLOduration=2.339036791 podStartE2EDuration="2.339036791s" podCreationTimestamp="2025-12-06 01:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:07:45.335663095 +0000 UTC m=+338.685953563" watchObservedRunningTime="2025-12-06 01:07:45.339036791 +0000 UTC m=+338.689327259" Dec 06 01:07:45 crc kubenswrapper[4991]: I1206 01:07:45.356231 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" podStartSLOduration=2.356209319 podStartE2EDuration="2.356209319s" podCreationTimestamp="2025-12-06 01:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:07:45.352130023 +0000 UTC m=+338.702420491" watchObservedRunningTime="2025-12-06 
01:07:45.356209319 +0000 UTC m=+338.706499787" Dec 06 01:07:46 crc kubenswrapper[4991]: I1206 01:07:46.317362 4991 generic.go:334] "Generic (PLEG): container finished" podID="97d40092-c83e-4410-9535-73e4aad9eafc" containerID="501f03431f43766253c9d9bd2e419da0309aa1f83336cb114b0be45ec6d07a7a" exitCode=0 Dec 06 01:07:46 crc kubenswrapper[4991]: I1206 01:07:46.317447 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9mn9" event={"ID":"97d40092-c83e-4410-9535-73e4aad9eafc","Type":"ContainerDied","Data":"501f03431f43766253c9d9bd2e419da0309aa1f83336cb114b0be45ec6d07a7a"} Dec 06 01:07:46 crc kubenswrapper[4991]: I1206 01:07:46.322430 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9ncc" event={"ID":"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1","Type":"ContainerStarted","Data":"d27704c080e82d06c1597200d1e004e1c12ca8ddeaa5265a73cce3195b218c67"} Dec 06 01:07:46 crc kubenswrapper[4991]: I1206 01:07:46.326935 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b8595ddbb-dkxx4" Dec 06 01:07:46 crc kubenswrapper[4991]: I1206 01:07:46.328468 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85c8d48899-x89xc" Dec 06 01:07:46 crc kubenswrapper[4991]: I1206 01:07:46.339369 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wb2rc" podStartSLOduration=3.771166839 podStartE2EDuration="6.339344681s" podCreationTimestamp="2025-12-06 01:07:40 +0000 UTC" firstStartedPulling="2025-12-06 01:07:42.190063332 +0000 UTC m=+335.540353800" lastFinishedPulling="2025-12-06 01:07:44.758241164 +0000 UTC m=+338.108531642" observedRunningTime="2025-12-06 01:07:45.391594905 +0000 UTC m=+338.741885373" watchObservedRunningTime="2025-12-06 01:07:46.339344681 +0000 UTC 
m=+339.689635149" Dec 06 01:07:47 crc kubenswrapper[4991]: I1206 01:07:47.330037 4991 generic.go:334] "Generic (PLEG): container finished" podID="d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1" containerID="d27704c080e82d06c1597200d1e004e1c12ca8ddeaa5265a73cce3195b218c67" exitCode=0 Dec 06 01:07:47 crc kubenswrapper[4991]: I1206 01:07:47.330133 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9ncc" event={"ID":"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1","Type":"ContainerDied","Data":"d27704c080e82d06c1597200d1e004e1c12ca8ddeaa5265a73cce3195b218c67"} Dec 06 01:07:47 crc kubenswrapper[4991]: I1206 01:07:47.332127 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9mn9" event={"ID":"97d40092-c83e-4410-9535-73e4aad9eafc","Type":"ContainerStarted","Data":"b00d02b98f14652d5943650c4068f70222317b8538f5cb2fe6970cbc6b181d4d"} Dec 06 01:07:47 crc kubenswrapper[4991]: I1206 01:07:47.370265 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w9mn9" podStartSLOduration=1.87423954 podStartE2EDuration="4.370232681s" podCreationTimestamp="2025-12-06 01:07:43 +0000 UTC" firstStartedPulling="2025-12-06 01:07:44.270580733 +0000 UTC m=+337.620871211" lastFinishedPulling="2025-12-06 01:07:46.766573874 +0000 UTC m=+340.116864352" observedRunningTime="2025-12-06 01:07:47.369008017 +0000 UTC m=+340.719298495" watchObservedRunningTime="2025-12-06 01:07:47.370232681 +0000 UTC m=+340.720523149" Dec 06 01:07:48 crc kubenswrapper[4991]: I1206 01:07:48.339655 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9ncc" event={"ID":"d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1","Type":"ContainerStarted","Data":"e99c85b8acca0f5ae18cd565fe550fbe325d47404989464a336b994bf35b7dca"} Dec 06 01:07:50 crc kubenswrapper[4991]: I1206 01:07:50.709411 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:50 crc kubenswrapper[4991]: I1206 01:07:50.709530 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:50 crc kubenswrapper[4991]: I1206 01:07:50.765686 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:50 crc kubenswrapper[4991]: I1206 01:07:50.794576 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v9ncc" podStartSLOduration=5.266063253 podStartE2EDuration="7.794557667s" podCreationTimestamp="2025-12-06 01:07:43 +0000 UTC" firstStartedPulling="2025-12-06 01:07:45.307130865 +0000 UTC m=+338.657421333" lastFinishedPulling="2025-12-06 01:07:47.835625289 +0000 UTC m=+341.185915747" observedRunningTime="2025-12-06 01:07:48.366452586 +0000 UTC m=+341.716743074" watchObservedRunningTime="2025-12-06 01:07:50.794557667 +0000 UTC m=+344.144848135" Dec 06 01:07:51 crc kubenswrapper[4991]: I1206 01:07:51.132697 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:51 crc kubenswrapper[4991]: I1206 01:07:51.133027 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:51 crc kubenswrapper[4991]: I1206 01:07:51.185715 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:51 crc kubenswrapper[4991]: I1206 01:07:51.415349 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-chxpn" Dec 06 01:07:51 crc kubenswrapper[4991]: I1206 01:07:51.418093 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-wb2rc" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.001779 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mcvg9" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.106197 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v8msm"] Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.467850 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.467982 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.537733 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.740143 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.740227 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:53 crc kubenswrapper[4991]: I1206 01:07:53.820041 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:54 crc kubenswrapper[4991]: I1206 01:07:54.440815 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v9ncc" Dec 06 01:07:54 crc kubenswrapper[4991]: I1206 01:07:54.451053 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9mn9" Dec 06 01:08:16 crc kubenswrapper[4991]: 
I1206 01:08:16.740483 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:08:16 crc kubenswrapper[4991]: I1206 01:08:16.741446 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.170865 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" podUID="4afb29d9-e4be-4c64-88d4-823d89c15ec8" containerName="registry" containerID="cri-o://c4463813345be68b0861983adfa026ac9ebc3d8366ca7332b286582d723c53b1" gracePeriod=30 Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.581047 4991 generic.go:334] "Generic (PLEG): container finished" podID="4afb29d9-e4be-4c64-88d4-823d89c15ec8" containerID="c4463813345be68b0861983adfa026ac9ebc3d8366ca7332b286582d723c53b1" exitCode=0 Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.581118 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" event={"ID":"4afb29d9-e4be-4c64-88d4-823d89c15ec8","Type":"ContainerDied","Data":"c4463813345be68b0861983adfa026ac9ebc3d8366ca7332b286582d723c53b1"} Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.724627 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.764904 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4afb29d9-e4be-4c64-88d4-823d89c15ec8-ca-trust-extracted\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765084 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-bound-sa-token\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765123 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-trusted-ca\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765314 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765375 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-certificates\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765452 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4afb29d9-e4be-4c64-88d4-823d89c15ec8-installation-pull-secrets\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765499 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8n24\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-kube-api-access-b8n24\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.765533 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-tls\") pod \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\" (UID: \"4afb29d9-e4be-4c64-88d4-823d89c15ec8\") " Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.767335 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.767977 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.776243 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afb29d9-e4be-4c64-88d4-823d89c15ec8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.776348 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.776408 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-kube-api-access-b8n24" (OuterVolumeSpecName: "kube-api-access-b8n24") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "kube-api-access-b8n24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.781214 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.783161 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.787886 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afb29d9-e4be-4c64-88d4-823d89c15ec8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4afb29d9-e4be-4c64-88d4-823d89c15ec8" (UID: "4afb29d9-e4be-4c64-88d4-823d89c15ec8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867239 4991 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4afb29d9-e4be-4c64-88d4-823d89c15ec8-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867276 4991 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867286 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8n24\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-kube-api-access-b8n24\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867298 4991 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/4afb29d9-e4be-4c64-88d4-823d89c15ec8-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867307 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4afb29d9-e4be-4c64-88d4-823d89c15ec8-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867316 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:18 crc kubenswrapper[4991]: I1206 01:08:18.867326 4991 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4afb29d9-e4be-4c64-88d4-823d89c15ec8-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 06 01:08:19 crc kubenswrapper[4991]: I1206 01:08:19.594686 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" event={"ID":"4afb29d9-e4be-4c64-88d4-823d89c15ec8","Type":"ContainerDied","Data":"509438158e14c60c9a0c12f72509bea658ccf58dad60b25dfdafcfb7ea2f3001"} Dec 06 01:08:19 crc kubenswrapper[4991]: I1206 01:08:19.594781 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v8msm" Dec 06 01:08:19 crc kubenswrapper[4991]: I1206 01:08:19.595434 4991 scope.go:117] "RemoveContainer" containerID="c4463813345be68b0861983adfa026ac9ebc3d8366ca7332b286582d723c53b1" Dec 06 01:08:19 crc kubenswrapper[4991]: I1206 01:08:19.631369 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v8msm"] Dec 06 01:08:19 crc kubenswrapper[4991]: I1206 01:08:19.639183 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v8msm"] Dec 06 01:08:21 crc kubenswrapper[4991]: I1206 01:08:21.054409 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afb29d9-e4be-4c64-88d4-823d89c15ec8" path="/var/lib/kubelet/pods/4afb29d9-e4be-4c64-88d4-823d89c15ec8/volumes" Dec 06 01:08:46 crc kubenswrapper[4991]: I1206 01:08:46.739497 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:08:46 crc kubenswrapper[4991]: I1206 01:08:46.740106 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:09:16 crc kubenswrapper[4991]: I1206 01:09:16.739751 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:09:16 
crc kubenswrapper[4991]: I1206 01:09:16.740656 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:09:16 crc kubenswrapper[4991]: I1206 01:09:16.740752 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:09:16 crc kubenswrapper[4991]: I1206 01:09:16.742207 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa487fa248ed7047ff7af2295096d5dbccca7d43cd1f2ade21a82400151996cb"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:09:16 crc kubenswrapper[4991]: I1206 01:09:16.742330 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://fa487fa248ed7047ff7af2295096d5dbccca7d43cd1f2ade21a82400151996cb" gracePeriod=600 Dec 06 01:09:17 crc kubenswrapper[4991]: I1206 01:09:17.015773 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="fa487fa248ed7047ff7af2295096d5dbccca7d43cd1f2ade21a82400151996cb" exitCode=0 Dec 06 01:09:17 crc kubenswrapper[4991]: I1206 01:09:17.015873 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"fa487fa248ed7047ff7af2295096d5dbccca7d43cd1f2ade21a82400151996cb"} Dec 06 01:09:17 crc kubenswrapper[4991]: I1206 01:09:17.016204 4991 scope.go:117] "RemoveContainer" containerID="674001323aaa7a65c9d6df014b5b35f7afb939aa6afe57ac9858923e3fe6ce15" Dec 06 01:09:18 crc kubenswrapper[4991]: I1206 01:09:18.028294 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"170286243e72a237ea08578f06e304cbd6d9204823fba253174d3c0df2ac7b5e"} Dec 06 01:11:46 crc kubenswrapper[4991]: I1206 01:11:46.739711 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:11:46 crc kubenswrapper[4991]: I1206 01:11:46.740609 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:12:16 crc kubenswrapper[4991]: I1206 01:12:16.739792 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:12:16 crc kubenswrapper[4991]: I1206 01:12:16.740674 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:12:46 crc kubenswrapper[4991]: I1206 01:12:46.740425 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:12:46 crc kubenswrapper[4991]: I1206 01:12:46.741072 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:12:46 crc kubenswrapper[4991]: I1206 01:12:46.741146 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:12:46 crc kubenswrapper[4991]: I1206 01:12:46.742123 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"170286243e72a237ea08578f06e304cbd6d9204823fba253174d3c0df2ac7b5e"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:12:46 crc kubenswrapper[4991]: I1206 01:12:46.742215 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://170286243e72a237ea08578f06e304cbd6d9204823fba253174d3c0df2ac7b5e" gracePeriod=600 Dec 06 01:12:47 crc kubenswrapper[4991]: I1206 
01:12:47.650494 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="170286243e72a237ea08578f06e304cbd6d9204823fba253174d3c0df2ac7b5e" exitCode=0 Dec 06 01:12:47 crc kubenswrapper[4991]: I1206 01:12:47.650594 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"170286243e72a237ea08578f06e304cbd6d9204823fba253174d3c0df2ac7b5e"} Dec 06 01:12:47 crc kubenswrapper[4991]: I1206 01:12:47.650943 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"42613daf70f47f0e109690b595c837fa9976654bde53b0036709cbe0f06846cc"} Dec 06 01:12:47 crc kubenswrapper[4991]: I1206 01:12:47.651013 4991 scope.go:117] "RemoveContainer" containerID="fa487fa248ed7047ff7af2295096d5dbccca7d43cd1f2ade21a82400151996cb" Dec 06 01:14:53 crc kubenswrapper[4991]: I1206 01:14:53.945071 4991 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.196559 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927"] Dec 06 01:15:00 crc kubenswrapper[4991]: E1206 01:15:00.197590 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afb29d9-e4be-4c64-88d4-823d89c15ec8" containerName="registry" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.197611 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afb29d9-e4be-4c64-88d4-823d89c15ec8" containerName="registry" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.197764 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afb29d9-e4be-4c64-88d4-823d89c15ec8" 
containerName="registry" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.198426 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.201705 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.201890 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.209397 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927"] Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.335777 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-secret-volume\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.335859 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-config-volume\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.335900 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxpmm\" (UniqueName: \"kubernetes.io/projected/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-kube-api-access-mxpmm\") 
pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.437459 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-secret-volume\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.437536 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-config-volume\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.437566 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxpmm\" (UniqueName: \"kubernetes.io/projected/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-kube-api-access-mxpmm\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.439279 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-config-volume\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.445873 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-secret-volume\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.460141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxpmm\" (UniqueName: \"kubernetes.io/projected/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-kube-api-access-mxpmm\") pod \"collect-profiles-29416395-kn927\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.520227 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:00 crc kubenswrapper[4991]: I1206 01:15:00.716905 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927"] Dec 06 01:15:00 crc kubenswrapper[4991]: W1206 01:15:00.730394 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4d64bf7_c39d_4ec3_86c6_e14d46d20388.slice/crio-be9a497a454b48019ae57a6c565fd1c6f19fd2ae4249f80cbbc76333fc43695e WatchSource:0}: Error finding container be9a497a454b48019ae57a6c565fd1c6f19fd2ae4249f80cbbc76333fc43695e: Status 404 returned error can't find the container with id be9a497a454b48019ae57a6c565fd1c6f19fd2ae4249f80cbbc76333fc43695e Dec 06 01:15:01 crc kubenswrapper[4991]: I1206 01:15:01.709674 4991 generic.go:334] "Generic (PLEG): container finished" podID="e4d64bf7-c39d-4ec3-86c6-e14d46d20388" containerID="5e8c4810473d427850db919966144089aee9a221e28ce2fb315f808076fb0205" exitCode=0 Dec 06 01:15:01 crc kubenswrapper[4991]: I1206 01:15:01.709741 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" event={"ID":"e4d64bf7-c39d-4ec3-86c6-e14d46d20388","Type":"ContainerDied","Data":"5e8c4810473d427850db919966144089aee9a221e28ce2fb315f808076fb0205"} Dec 06 01:15:01 crc kubenswrapper[4991]: I1206 01:15:01.709781 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" event={"ID":"e4d64bf7-c39d-4ec3-86c6-e14d46d20388","Type":"ContainerStarted","Data":"be9a497a454b48019ae57a6c565fd1c6f19fd2ae4249f80cbbc76333fc43695e"} Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.026567 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.199177 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-config-volume\") pod \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.199370 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-secret-volume\") pod \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.199449 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxpmm\" (UniqueName: \"kubernetes.io/projected/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-kube-api-access-mxpmm\") pod \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\" (UID: \"e4d64bf7-c39d-4ec3-86c6-e14d46d20388\") " Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.200626 4991 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-config-volume" (OuterVolumeSpecName: "config-volume") pod "e4d64bf7-c39d-4ec3-86c6-e14d46d20388" (UID: "e4d64bf7-c39d-4ec3-86c6-e14d46d20388"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.208857 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-kube-api-access-mxpmm" (OuterVolumeSpecName: "kube-api-access-mxpmm") pod "e4d64bf7-c39d-4ec3-86c6-e14d46d20388" (UID: "e4d64bf7-c39d-4ec3-86c6-e14d46d20388"). InnerVolumeSpecName "kube-api-access-mxpmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.208931 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e4d64bf7-c39d-4ec3-86c6-e14d46d20388" (UID: "e4d64bf7-c39d-4ec3-86c6-e14d46d20388"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.301359 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.301414 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxpmm\" (UniqueName: \"kubernetes.io/projected/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-kube-api-access-mxpmm\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.301427 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4d64bf7-c39d-4ec3-86c6-e14d46d20388-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.725926 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" event={"ID":"e4d64bf7-c39d-4ec3-86c6-e14d46d20388","Type":"ContainerDied","Data":"be9a497a454b48019ae57a6c565fd1c6f19fd2ae4249f80cbbc76333fc43695e"} Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.726483 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be9a497a454b48019ae57a6c565fd1c6f19fd2ae4249f80cbbc76333fc43695e" Dec 06 01:15:03 crc kubenswrapper[4991]: I1206 01:15:03.726037 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.876341 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4w5qq"] Dec 06 01:15:10 crc kubenswrapper[4991]: E1206 01:15:10.877025 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d64bf7-c39d-4ec3-86c6-e14d46d20388" containerName="collect-profiles" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.877044 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d64bf7-c39d-4ec3-86c6-e14d46d20388" containerName="collect-profiles" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.877194 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4d64bf7-c39d-4ec3-86c6-e14d46d20388" containerName="collect-profiles" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.877683 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4w5qq" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.883354 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.884222 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.886003 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pffvq" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.894654 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wcfbf"] Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.895830 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.897388 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lhd9w" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.908955 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2zqr8"] Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.909727 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.911319 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zpxg2" Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.912263 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4w5qq"] Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.915747 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2zqr8"] Dec 06 01:15:10 crc kubenswrapper[4991]: I1206 01:15:10.921757 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wcfbf"] Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.011540 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8k6\" (UniqueName: \"kubernetes.io/projected/8bcf50ab-be54-4a1a-95ca-f43426ebf55e-kube-api-access-wg8k6\") pod \"cert-manager-cainjector-7f985d654d-wcfbf\" (UID: \"8bcf50ab-be54-4a1a-95ca-f43426ebf55e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.012074 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b6gl\" (UniqueName: 
\"kubernetes.io/projected/4c9fcb25-80e7-4c9a-b31d-68a4b1b82411-kube-api-access-2b6gl\") pod \"cert-manager-5b446d88c5-4w5qq\" (UID: \"4c9fcb25-80e7-4c9a-b31d-68a4b1b82411\") " pod="cert-manager/cert-manager-5b446d88c5-4w5qq" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.114051 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8k6\" (UniqueName: \"kubernetes.io/projected/8bcf50ab-be54-4a1a-95ca-f43426ebf55e-kube-api-access-wg8k6\") pod \"cert-manager-cainjector-7f985d654d-wcfbf\" (UID: \"8bcf50ab-be54-4a1a-95ca-f43426ebf55e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.114184 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b6gl\" (UniqueName: \"kubernetes.io/projected/4c9fcb25-80e7-4c9a-b31d-68a4b1b82411-kube-api-access-2b6gl\") pod \"cert-manager-5b446d88c5-4w5qq\" (UID: \"4c9fcb25-80e7-4c9a-b31d-68a4b1b82411\") " pod="cert-manager/cert-manager-5b446d88c5-4w5qq" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.114285 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4vp\" (UniqueName: \"kubernetes.io/projected/217fa056-c208-4c13-af81-12e5e1c4d231-kube-api-access-df4vp\") pod \"cert-manager-webhook-5655c58dd6-2zqr8\" (UID: \"217fa056-c208-4c13-af81-12e5e1c4d231\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.134860 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8k6\" (UniqueName: \"kubernetes.io/projected/8bcf50ab-be54-4a1a-95ca-f43426ebf55e-kube-api-access-wg8k6\") pod \"cert-manager-cainjector-7f985d654d-wcfbf\" (UID: \"8bcf50ab-be54-4a1a-95ca-f43426ebf55e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.136852 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b6gl\" (UniqueName: \"kubernetes.io/projected/4c9fcb25-80e7-4c9a-b31d-68a4b1b82411-kube-api-access-2b6gl\") pod \"cert-manager-5b446d88c5-4w5qq\" (UID: \"4c9fcb25-80e7-4c9a-b31d-68a4b1b82411\") " pod="cert-manager/cert-manager-5b446d88c5-4w5qq" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.194903 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4w5qq" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.209637 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.214919 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4vp\" (UniqueName: \"kubernetes.io/projected/217fa056-c208-4c13-af81-12e5e1c4d231-kube-api-access-df4vp\") pod \"cert-manager-webhook-5655c58dd6-2zqr8\" (UID: \"217fa056-c208-4c13-af81-12e5e1c4d231\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.243828 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4vp\" (UniqueName: \"kubernetes.io/projected/217fa056-c208-4c13-af81-12e5e1c4d231-kube-api-access-df4vp\") pod \"cert-manager-webhook-5655c58dd6-2zqr8\" (UID: \"217fa056-c208-4c13-af81-12e5e1c4d231\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.431724 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wcfbf"] Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.443286 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.469755 4991 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4w5qq"] Dec 06 01:15:11 crc kubenswrapper[4991]: W1206 01:15:11.471642 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c9fcb25_80e7_4c9a_b31d_68a4b1b82411.slice/crio-25c75628076c1cf3f1c9c57f07784d0dbb2aaf4c7334138f7aa246d46c5a980b WatchSource:0}: Error finding container 25c75628076c1cf3f1c9c57f07784d0dbb2aaf4c7334138f7aa246d46c5a980b: Status 404 returned error can't find the container with id 25c75628076c1cf3f1c9c57f07784d0dbb2aaf4c7334138f7aa246d46c5a980b Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.522321 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.794589 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2zqr8"] Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.798332 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4w5qq" event={"ID":"4c9fcb25-80e7-4c9a-b31d-68a4b1b82411","Type":"ContainerStarted","Data":"25c75628076c1cf3f1c9c57f07784d0dbb2aaf4c7334138f7aa246d46c5a980b"} Dec 06 01:15:11 crc kubenswrapper[4991]: W1206 01:15:11.803228 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod217fa056_c208_4c13_af81_12e5e1c4d231.slice/crio-58691d0322bc20955e50f1d64ca25e916b9348e9be658a09bfc9697a39fde7bc WatchSource:0}: Error finding container 58691d0322bc20955e50f1d64ca25e916b9348e9be658a09bfc9697a39fde7bc: Status 404 returned error can't find the container with id 58691d0322bc20955e50f1d64ca25e916b9348e9be658a09bfc9697a39fde7bc Dec 06 01:15:11 crc kubenswrapper[4991]: I1206 01:15:11.803741 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" event={"ID":"8bcf50ab-be54-4a1a-95ca-f43426ebf55e","Type":"ContainerStarted","Data":"074897da8ce9eee3add1a2d582c712d8163c2c9ef10872d635f58ecc7e290d5a"} Dec 06 01:15:12 crc kubenswrapper[4991]: I1206 01:15:12.814660 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" event={"ID":"217fa056-c208-4c13-af81-12e5e1c4d231","Type":"ContainerStarted","Data":"58691d0322bc20955e50f1d64ca25e916b9348e9be658a09bfc9697a39fde7bc"} Dec 06 01:15:14 crc kubenswrapper[4991]: I1206 01:15:14.827694 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" event={"ID":"8bcf50ab-be54-4a1a-95ca-f43426ebf55e","Type":"ContainerStarted","Data":"c80ac283f5a64f23781274f10b7251ea75531bbe60f0f7916eaf6c0eb95e60d6"} Dec 06 01:15:14 crc kubenswrapper[4991]: I1206 01:15:14.829764 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4w5qq" event={"ID":"4c9fcb25-80e7-4c9a-b31d-68a4b1b82411","Type":"ContainerStarted","Data":"b108fcc6bb67bc0e5af4e19e291f5ca76c716d057b3053e1469fc80fb1615f3d"} Dec 06 01:15:14 crc kubenswrapper[4991]: I1206 01:15:14.846886 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wcfbf" podStartSLOduration=2.2980936290000002 podStartE2EDuration="4.846858817s" podCreationTimestamp="2025-12-06 01:15:10 +0000 UTC" firstStartedPulling="2025-12-06 01:15:11.442703434 +0000 UTC m=+784.792993902" lastFinishedPulling="2025-12-06 01:15:13.991468612 +0000 UTC m=+787.341759090" observedRunningTime="2025-12-06 01:15:14.846150538 +0000 UTC m=+788.196441056" watchObservedRunningTime="2025-12-06 01:15:14.846858817 +0000 UTC m=+788.197149285" Dec 06 01:15:14 crc kubenswrapper[4991]: I1206 01:15:14.873333 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-5b446d88c5-4w5qq" podStartSLOduration=2.3077534760000002 podStartE2EDuration="4.873308721s" podCreationTimestamp="2025-12-06 01:15:10 +0000 UTC" firstStartedPulling="2025-12-06 01:15:11.47559175 +0000 UTC m=+784.825882218" lastFinishedPulling="2025-12-06 01:15:14.041146995 +0000 UTC m=+787.391437463" observedRunningTime="2025-12-06 01:15:14.865235266 +0000 UTC m=+788.215525774" watchObservedRunningTime="2025-12-06 01:15:14.873308721 +0000 UTC m=+788.223599189" Dec 06 01:15:15 crc kubenswrapper[4991]: I1206 01:15:15.843557 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" event={"ID":"217fa056-c208-4c13-af81-12e5e1c4d231","Type":"ContainerStarted","Data":"04928f17400ad3c152ab34699c3ad8199a05f128366e656fcb060e7929def1e2"} Dec 06 01:15:15 crc kubenswrapper[4991]: I1206 01:15:15.868345 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" podStartSLOduration=2.473592291 podStartE2EDuration="5.868321802s" podCreationTimestamp="2025-12-06 01:15:10 +0000 UTC" firstStartedPulling="2025-12-06 01:15:11.805772191 +0000 UTC m=+785.156062649" lastFinishedPulling="2025-12-06 01:15:15.200501692 +0000 UTC m=+788.550792160" observedRunningTime="2025-12-06 01:15:15.86336249 +0000 UTC m=+789.213652988" watchObservedRunningTime="2025-12-06 01:15:15.868321802 +0000 UTC m=+789.218612300" Dec 06 01:15:16 crc kubenswrapper[4991]: I1206 01:15:16.522390 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:16 crc kubenswrapper[4991]: I1206 01:15:16.740120 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Dec 06 01:15:16 crc kubenswrapper[4991]: I1206 01:15:16.740218 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.525686 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2zqr8" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.708963 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5t29r"] Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.709781 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovn-controller" containerID="cri-o://a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.709814 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="nbdb" containerID="cri-o://223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.710064 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="sbdb" containerID="cri-o://fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.710104 4991 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovn-acl-logging" containerID="cri-o://ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.710137 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.710082 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-node" containerID="cri-o://a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.710180 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="northd" containerID="cri-o://8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.768613 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" containerID="cri-o://560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" gracePeriod=30 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.890786 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/3.log" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 
01:15:21.893074 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovn-acl-logging/0.log" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.893721 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovn-controller/0.log" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894184 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" exitCode=0 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894224 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" exitCode=0 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894238 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" exitCode=143 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894253 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" exitCode=143 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894310 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156"} Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894345 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" 
event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86"} Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894365 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842"} Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.894381 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8"} Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.896915 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/2.log" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.897734 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/1.log" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.897784 4991 generic.go:334] "Generic (PLEG): container finished" podID="06f95c58-fd69-48f6-94be-15927aa850c2" containerID="f8b1d542438157988a9ed58865b298ebff8c7d79f1ab942a399a01c263d6b4eb" exitCode=2 Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.897814 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerDied","Data":"f8b1d542438157988a9ed58865b298ebff8c7d79f1ab942a399a01c263d6b4eb"} Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 01:15:21.897848 4991 scope.go:117] "RemoveContainer" containerID="ce2778914060f77aac8e6e77fc814436aeed10afdc9565dcd71b3b4c15324072" Dec 06 01:15:21 crc kubenswrapper[4991]: I1206 
01:15:21.904171 4991 scope.go:117] "RemoveContainer" containerID="f8b1d542438157988a9ed58865b298ebff8c7d79f1ab942a399a01c263d6b4eb" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.069744 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/3.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.071642 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovn-acl-logging/0.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.072034 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovn-controller/0.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.072374 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106533 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-bin\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106579 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-kubelet\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106601 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-netns\") pod 
\"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106623 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-log-socket\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106652 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pnfh\" (UniqueName: \"kubernetes.io/projected/7a1c8508-3f35-44b2-988e-cc665121ce94-kube-api-access-5pnfh\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106691 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-etc-openvswitch\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106714 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-systemd-units\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106765 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-netd\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106803 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-env-overrides\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106830 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-script-lib\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106829 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106853 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106861 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-slash\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106909 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-slash" (OuterVolumeSpecName: "host-slash") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106935 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-log-socket" (OuterVolumeSpecName: "log-socket") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106947 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-ovn-kubernetes\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106954 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.106993 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-var-lib-openvswitch\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107004 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107027 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a1c8508-3f35-44b2-988e-cc665121ce94-ovn-node-metrics-cert\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107033 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107121 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-openvswitch\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107141 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-systemd\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107170 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-config\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107230 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-ovn\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107249 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-node-log\") pod \"7a1c8508-3f35-44b2-988e-cc665121ce94\" (UID: \"7a1c8508-3f35-44b2-988e-cc665121ce94\") " Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107449 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107903 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.107952 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108105 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108143 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108178 4991 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108199 4991 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108210 4991 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-log-socket\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108220 4991 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 
01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108230 4991 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108242 4991 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108279 4991 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108287 4991 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108296 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108305 4991 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-slash\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108314 4991 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108169 4991 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108204 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108535 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108855 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-node-log" (OuterVolumeSpecName: "node-log") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.108891 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.116273 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1c8508-3f35-44b2-988e-cc665121ce94-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.117517 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1c8508-3f35-44b2-988e-cc665121ce94-kube-api-access-5pnfh" (OuterVolumeSpecName: "kube-api-access-5pnfh") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "kube-api-access-5pnfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130032 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4cfz6"] Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130317 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130341 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130352 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-node" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130359 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-node" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130374 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovn-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130382 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovn-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130389 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="nbdb" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130395 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="nbdb" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130410 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" 
containerName="ovn-acl-logging" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130416 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovn-acl-logging" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130426 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="northd" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130432 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="northd" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130441 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kubecfg-setup" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130447 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kubecfg-setup" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130458 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130465 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130473 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="sbdb" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130480 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="sbdb" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130492 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 
crc kubenswrapper[4991]: I1206 01:15:22.130502 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130512 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130530 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130540 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130548 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130683 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130694 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="nbdb" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130703 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130712 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130721 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" 
containerName="ovn-acl-logging" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130756 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130764 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="northd" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130772 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-node" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130779 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="sbdb" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130793 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovn-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130803 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="kube-rbac-proxy-ovn-metrics" Dec 06 01:15:22 crc kubenswrapper[4991]: E1206 01:15:22.130918 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.130926 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.131063 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerName="ovnkube-controller" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.132826 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.136749 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7a1c8508-3f35-44b2-988e-cc665121ce94" (UID: "7a1c8508-3f35-44b2-988e-cc665121ce94"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.209724 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-kubelet\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.209825 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-var-lib-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.209864 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-cni-bin\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.209888 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-log-socket\") pod 
\"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210055 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwsx\" (UniqueName: \"kubernetes.io/projected/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-kube-api-access-tfwsx\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210117 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-cni-netd\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210152 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-systemd-units\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210181 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210226 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-ovn\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210256 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-node-log\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210291 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-slash\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210306 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovnkube-script-lib\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210338 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210361 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovnkube-config\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210379 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-etc-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210409 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-run-netns\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210426 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovn-node-metrics-cert\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210444 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-env-overrides\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210469 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-systemd\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210551 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210605 4991 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210622 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pnfh\" (UniqueName: \"kubernetes.io/projected/7a1c8508-3f35-44b2-988e-cc665121ce94-kube-api-access-5pnfh\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210644 4991 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210658 4991 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210667 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7a1c8508-3f35-44b2-988e-cc665121ce94-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210678 4991 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210687 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a1c8508-3f35-44b2-988e-cc665121ce94-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210696 4991 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.210705 4991 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a1c8508-3f35-44b2-988e-cc665121ce94-node-log\") on node \"crc\" DevicePath \"\"" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312360 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovnkube-script-lib\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312437 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 
01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312475 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovnkube-config\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312506 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-etc-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312540 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-run-netns\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312585 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovn-node-metrics-cert\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312630 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-env-overrides\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312679 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-systemd\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312739 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312793 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-kubelet\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312828 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-var-lib-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312862 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-cni-bin\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312903 4991 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-log-socket\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.312940 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfwsx\" (UniqueName: \"kubernetes.io/projected/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-kube-api-access-tfwsx\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313025 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-cni-netd\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313076 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-systemd-units\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313110 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313153 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-ovn\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313184 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-node-log\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313216 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-slash\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.313326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-slash\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.314553 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovnkube-script-lib\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.314641 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315077 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-var-lib-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315143 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-systemd\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315189 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-kubelet\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315238 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-run-netns\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315291 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-cni-netd\") pod 
\"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315281 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315325 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-host-cni-bin\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315378 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-log-socket\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315375 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-etc-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315432 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-openvswitch\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315493 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-run-ovn\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315507 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-systemd-units\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.315550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-node-log\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.316294 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-env-overrides\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.316873 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovnkube-config\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.320905 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-ovn-node-metrics-cert\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.345894 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfwsx\" (UniqueName: \"kubernetes.io/projected/bb27057f-d6f5-44bb-baf4-1da4ba789bb1-kube-api-access-tfwsx\") pod \"ovnkube-node-4cfz6\" (UID: \"bb27057f-d6f5-44bb-baf4-1da4ba789bb1\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.451356 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:22 crc kubenswrapper[4991]: W1206 01:15:22.482899 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb27057f_d6f5_44bb_baf4_1da4ba789bb1.slice/crio-388381e32b81363c3e042e2d27b23feb98be6973238ff5b9d4b6d50c2b840985 WatchSource:0}: Error finding container 388381e32b81363c3e042e2d27b23feb98be6973238ff5b9d4b6d50c2b840985: Status 404 returned error can't find the container with id 388381e32b81363c3e042e2d27b23feb98be6973238ff5b9d4b6d50c2b840985 Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.907786 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovnkube-controller/3.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.910391 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovn-acl-logging/0.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.910969 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5t29r_7a1c8508-3f35-44b2-988e-cc665121ce94/ovn-controller/0.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911617 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" exitCode=0 Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911667 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" exitCode=0 Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911686 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" exitCode=0 Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911698 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911726 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911773 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911797 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911816 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911842 4991 scope.go:117] "RemoveContainer" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.911704 4991 generic.go:334] "Generic (PLEG): container finished" podID="7a1c8508-3f35-44b2-988e-cc665121ce94" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" exitCode=0 Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.912162 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5t29r" 
event={"ID":"7a1c8508-3f35-44b2-988e-cc665121ce94","Type":"ContainerDied","Data":"a3bd3088dfcb2b7a46a300624e3817052a809cdabf3674dfcfad4d82c9659ebe"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.914890 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sq7kg_06f95c58-fd69-48f6-94be-15927aa850c2/kube-multus/2.log" Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.915160 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sq7kg" event={"ID":"06f95c58-fd69-48f6-94be-15927aa850c2","Type":"ContainerStarted","Data":"1be769e537e8ec7e33ac7da39b4edff169d627ff63874f774b526927bff152e1"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.917141 4991 generic.go:334] "Generic (PLEG): container finished" podID="bb27057f-d6f5-44bb-baf4-1da4ba789bb1" containerID="f9c7113ec854f8ea1a8a09ef620dbd95489c51764c2e3c3270d6c7f565b9e3d2" exitCode=0 Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.917203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerDied","Data":"f9c7113ec854f8ea1a8a09ef620dbd95489c51764c2e3c3270d6c7f565b9e3d2"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.917238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"388381e32b81363c3e042e2d27b23feb98be6973238ff5b9d4b6d50c2b840985"} Dec 06 01:15:22 crc kubenswrapper[4991]: I1206 01:15:22.951017 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.009786 4991 scope.go:117] "RemoveContainer" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.031102 4991 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5t29r"] Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.033683 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5t29r"] Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.045261 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1c8508-3f35-44b2-988e-cc665121ce94" path="/var/lib/kubelet/pods/7a1c8508-3f35-44b2-988e-cc665121ce94/volumes" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.051076 4991 scope.go:117] "RemoveContainer" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.066418 4991 scope.go:117] "RemoveContainer" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.095183 4991 scope.go:117] "RemoveContainer" containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.117465 4991 scope.go:117] "RemoveContainer" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.138149 4991 scope.go:117] "RemoveContainer" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.164805 4991 scope.go:117] "RemoveContainer" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.188849 4991 scope.go:117] "RemoveContainer" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.224416 4991 scope.go:117] "RemoveContainer" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 
01:15:23.225235 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": container with ID starting with 560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3 not found: ID does not exist" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.225274 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3"} err="failed to get container status \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": rpc error: code = NotFound desc = could not find container \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": container with ID starting with 560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.225309 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.226240 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": container with ID starting with 7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377 not found: ID does not exist" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.226294 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"} err="failed to get container status \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": rpc 
error: code = NotFound desc = could not find container \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": container with ID starting with 7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.226335 4991 scope.go:117] "RemoveContainer" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.226720 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": container with ID starting with fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda not found: ID does not exist" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.226762 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda"} err="failed to get container status \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": rpc error: code = NotFound desc = could not find container \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": container with ID starting with fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.226791 4991 scope.go:117] "RemoveContainer" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.227113 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": container with ID starting with 
223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d not found: ID does not exist" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.227139 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d"} err="failed to get container status \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": rpc error: code = NotFound desc = could not find container \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": container with ID starting with 223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.227152 4991 scope.go:117] "RemoveContainer" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.227405 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": container with ID starting with 8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7 not found: ID does not exist" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.227430 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7"} err="failed to get container status \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": rpc error: code = NotFound desc = could not find container \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": container with ID starting with 8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7 not found: ID does not 
exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.227443 4991 scope.go:117] "RemoveContainer" containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.227706 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": container with ID starting with 019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156 not found: ID does not exist" containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.227740 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156"} err="failed to get container status \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": rpc error: code = NotFound desc = could not find container \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": container with ID starting with 019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.227759 4991 scope.go:117] "RemoveContainer" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.228388 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": container with ID starting with a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86 not found: ID does not exist" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.228419 4991 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86"} err="failed to get container status \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": rpc error: code = NotFound desc = could not find container \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": container with ID starting with a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.228437 4991 scope.go:117] "RemoveContainer" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.228707 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": container with ID starting with ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842 not found: ID does not exist" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.228735 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842"} err="failed to get container status \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": rpc error: code = NotFound desc = could not find container \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": container with ID starting with ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.228752 4991 scope.go:117] "RemoveContainer" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.229032 4991 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": container with ID starting with a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8 not found: ID does not exist" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229060 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8"} err="failed to get container status \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": rpc error: code = NotFound desc = could not find container \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": container with ID starting with a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229076 4991 scope.go:117] "RemoveContainer" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" Dec 06 01:15:23 crc kubenswrapper[4991]: E1206 01:15:23.229320 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": container with ID starting with e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396 not found: ID does not exist" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229342 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396"} err="failed to get container status \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": rpc error: code = NotFound desc = could 
not find container \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": container with ID starting with e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229373 4991 scope.go:117] "RemoveContainer" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229593 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3"} err="failed to get container status \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": rpc error: code = NotFound desc = could not find container \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": container with ID starting with 560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229614 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229830 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"} err="failed to get container status \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": rpc error: code = NotFound desc = could not find container \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": container with ID starting with 7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.229865 4991 scope.go:117] "RemoveContainer" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 
01:15:23.230115 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda"} err="failed to get container status \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": rpc error: code = NotFound desc = could not find container \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": container with ID starting with fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.230131 4991 scope.go:117] "RemoveContainer" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.230353 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d"} err="failed to get container status \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": rpc error: code = NotFound desc = could not find container \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": container with ID starting with 223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.230373 4991 scope.go:117] "RemoveContainer" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.230652 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7"} err="failed to get container status \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": rpc error: code = NotFound desc = could not find container \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": container with ID starting with 
8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.230713 4991 scope.go:117] "RemoveContainer" containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.232428 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156"} err="failed to get container status \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": rpc error: code = NotFound desc = could not find container \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": container with ID starting with 019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.232457 4991 scope.go:117] "RemoveContainer" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.232840 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86"} err="failed to get container status \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": rpc error: code = NotFound desc = could not find container \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": container with ID starting with a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.232892 4991 scope.go:117] "RemoveContainer" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.233316 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842"} err="failed to get container status \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": rpc error: code = NotFound desc = could not find container \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": container with ID starting with ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.233365 4991 scope.go:117] "RemoveContainer" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.233699 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8"} err="failed to get container status \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": rpc error: code = NotFound desc = could not find container \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": container with ID starting with a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.233723 4991 scope.go:117] "RemoveContainer" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.234029 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396"} err="failed to get container status \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": rpc error: code = NotFound desc = could not find container \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": container with ID starting with e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396 not found: ID does not 
exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.234099 4991 scope.go:117] "RemoveContainer" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.234457 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3"} err="failed to get container status \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": rpc error: code = NotFound desc = could not find container \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": container with ID starting with 560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.234487 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.234820 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"} err="failed to get container status \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": rpc error: code = NotFound desc = could not find container \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": container with ID starting with 7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.234849 4991 scope.go:117] "RemoveContainer" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.235302 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda"} err="failed to get container status 
\"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": rpc error: code = NotFound desc = could not find container \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": container with ID starting with fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.235330 4991 scope.go:117] "RemoveContainer" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.235764 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d"} err="failed to get container status \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": rpc error: code = NotFound desc = could not find container \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": container with ID starting with 223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.235900 4991 scope.go:117] "RemoveContainer" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.236436 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7"} err="failed to get container status \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": rpc error: code = NotFound desc = could not find container \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": container with ID starting with 8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.236470 4991 scope.go:117] "RemoveContainer" 
containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.237057 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156"} err="failed to get container status \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": rpc error: code = NotFound desc = could not find container \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": container with ID starting with 019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.237106 4991 scope.go:117] "RemoveContainer" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.237471 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86"} err="failed to get container status \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": rpc error: code = NotFound desc = could not find container \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": container with ID starting with a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.237498 4991 scope.go:117] "RemoveContainer" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.237892 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842"} err="failed to get container status \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": rpc error: code = NotFound desc = could 
not find container \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": container with ID starting with ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.237918 4991 scope.go:117] "RemoveContainer" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.238219 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8"} err="failed to get container status \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": rpc error: code = NotFound desc = could not find container \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": container with ID starting with a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.238270 4991 scope.go:117] "RemoveContainer" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.238927 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396"} err="failed to get container status \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": rpc error: code = NotFound desc = could not find container \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": container with ID starting with e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.238960 4991 scope.go:117] "RemoveContainer" containerID="560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 
01:15:23.239317 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3"} err="failed to get container status \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": rpc error: code = NotFound desc = could not find container \"560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3\": container with ID starting with 560c4f661e91f29a61975f6700a9745378b967d04dd66204f7b0b1a0b219abd3 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.239339 4991 scope.go:117] "RemoveContainer" containerID="7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.239594 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377"} err="failed to get container status \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": rpc error: code = NotFound desc = could not find container \"7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377\": container with ID starting with 7145e36d4adf53d8fd1f49d4918d8d275945f311209c21994c9048a268ed1377 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.239623 4991 scope.go:117] "RemoveContainer" containerID="fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.239957 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda"} err="failed to get container status \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": rpc error: code = NotFound desc = could not find container \"fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda\": container with ID starting with 
fdc7d65b0cbc47527f705e0d5617ad319fa817140facf5a814e11fd521c1efda not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.240087 4991 scope.go:117] "RemoveContainer" containerID="223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.240690 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d"} err="failed to get container status \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": rpc error: code = NotFound desc = could not find container \"223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d\": container with ID starting with 223e6b0365fc312bf7ac99358280dd089cd7f9466a51a6ea8625dd1f39f0bf0d not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.240764 4991 scope.go:117] "RemoveContainer" containerID="8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.241078 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7"} err="failed to get container status \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": rpc error: code = NotFound desc = could not find container \"8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7\": container with ID starting with 8c6adcf440fa43efead61d58e653f6865a19658d9f1dad332accd9deeea4e6f7 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.241109 4991 scope.go:117] "RemoveContainer" containerID="019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.241569 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156"} err="failed to get container status \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": rpc error: code = NotFound desc = could not find container \"019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156\": container with ID starting with 019ff35f6f58349cee69fa9343b3f0436983942119917b8cbc870ecd0470e156 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.241602 4991 scope.go:117] "RemoveContainer" containerID="a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.242077 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86"} err="failed to get container status \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": rpc error: code = NotFound desc = could not find container \"a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86\": container with ID starting with a619a78c8eeb6b49daf4e2f5ebc2ce62af7a171cad38d6c0fe03f551152ddc86 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.242105 4991 scope.go:117] "RemoveContainer" containerID="ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.242503 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842"} err="failed to get container status \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": rpc error: code = NotFound desc = could not find container \"ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842\": container with ID starting with ed1b05ed48050fd55805a15f8334c88e6d86b91a9ef048cdcf36bc7308ffa842 not found: ID does not 
exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.242526 4991 scope.go:117] "RemoveContainer" containerID="a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.242798 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8"} err="failed to get container status \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": rpc error: code = NotFound desc = could not find container \"a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8\": container with ID starting with a6d30d2c78830e9cd9d1d39c81d5c0ebedbc2c286117464da0366a171047cee8 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.242821 4991 scope.go:117] "RemoveContainer" containerID="e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.243221 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396"} err="failed to get container status \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": rpc error: code = NotFound desc = could not find container \"e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396\": container with ID starting with e8ee3e2975a9496791f52f17df6b19de3c4b0d0ff51f7f11283a0cde6949e396 not found: ID does not exist" Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.928396 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"acfa735db34b043fafb2030121abefc3459b96d953317b66c5144f060912f280"} Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.928741 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"407612fde7a6a2d9c61ea4a0b6a2156a6f6430c30453a926a3ea1781a21565f9"} Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.928765 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"4807bed20d355e8f0b9035060e3d0b9fe0360e7ac0408928fa6368903c4f727d"} Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.928784 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"f04436d414d51fce0eb562878520b6db322188748ae4c776b7aa882570359dc3"} Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.928800 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"b333142d4a61f7618ea9f8c48d1db66587bf8dd7743ef9ac02cfe3ef3e7421e0"} Dec 06 01:15:23 crc kubenswrapper[4991]: I1206 01:15:23.928816 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"3377225f40c97582a6d313bdf8daf46997f78359961fbf1c02d832b52e92bbad"} Dec 06 01:15:26 crc kubenswrapper[4991]: I1206 01:15:26.957871 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"fa72451d25ba736145fe4187431e49e82bb25c00189508dd462e46dc4afcc8be"} Dec 06 01:15:28 crc kubenswrapper[4991]: I1206 01:15:28.979555 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" 
event={"ID":"bb27057f-d6f5-44bb-baf4-1da4ba789bb1","Type":"ContainerStarted","Data":"ed94a5c61e34aab9bacc962c6ae91427a742498c99b1d8013242d9df74cb5cc1"} Dec 06 01:15:28 crc kubenswrapper[4991]: I1206 01:15:28.980163 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:28 crc kubenswrapper[4991]: I1206 01:15:28.980189 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:28 crc kubenswrapper[4991]: I1206 01:15:28.980203 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:29 crc kubenswrapper[4991]: I1206 01:15:29.016877 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:29 crc kubenswrapper[4991]: I1206 01:15:29.019184 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:15:29 crc kubenswrapper[4991]: I1206 01:15:29.026016 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" podStartSLOduration=7.025967532 podStartE2EDuration="7.025967532s" podCreationTimestamp="2025-12-06 01:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:15:29.025298124 +0000 UTC m=+802.375588602" watchObservedRunningTime="2025-12-06 01:15:29.025967532 +0000 UTC m=+802.376258010" Dec 06 01:15:46 crc kubenswrapper[4991]: I1206 01:15:46.739678 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Dec 06 01:15:46 crc kubenswrapper[4991]: I1206 01:15:46.741345 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:15:52 crc kubenswrapper[4991]: I1206 01:15:52.479713 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cfz6" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.416719 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99"] Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.419672 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.422739 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.432161 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99"] Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.490731 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.490850 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.490881 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dnrt\" (UniqueName: \"kubernetes.io/projected/6f2005a0-682b-485a-bb40-b60cd9683c2e-kube-api-access-6dnrt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.594136 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.594528 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.594620 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dnrt\" (UniqueName: 
\"kubernetes.io/projected/6f2005a0-682b-485a-bb40-b60cd9683c2e-kube-api-access-6dnrt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.594845 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.595327 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.634465 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dnrt\" (UniqueName: \"kubernetes.io/projected/6f2005a0-682b-485a-bb40-b60cd9683c2e-kube-api-access-6dnrt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:02 crc kubenswrapper[4991]: I1206 01:16:02.761223 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:03 crc kubenswrapper[4991]: I1206 01:16:03.059884 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99"] Dec 06 01:16:03 crc kubenswrapper[4991]: I1206 01:16:03.239013 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" event={"ID":"6f2005a0-682b-485a-bb40-b60cd9683c2e","Type":"ContainerStarted","Data":"20ca8288ca9ca29597a4e59f38626dc5000b57f63145237cf8e8c022a97e9db3"} Dec 06 01:16:03 crc kubenswrapper[4991]: I1206 01:16:03.239597 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" event={"ID":"6f2005a0-682b-485a-bb40-b60cd9683c2e","Type":"ContainerStarted","Data":"21842c3ab18ceeb19527df7e3953af48c8dabc85bbae9af5fd8dec5dbfcfad8a"} Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.246506 4991 generic.go:334] "Generic (PLEG): container finished" podID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerID="20ca8288ca9ca29597a4e59f38626dc5000b57f63145237cf8e8c022a97e9db3" exitCode=0 Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.246579 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" event={"ID":"6f2005a0-682b-485a-bb40-b60cd9683c2e","Type":"ContainerDied","Data":"20ca8288ca9ca29597a4e59f38626dc5000b57f63145237cf8e8c022a97e9db3"} Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.750228 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j5w5j"] Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.752256 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.773837 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5w5j"] Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.854206 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-utilities\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.854380 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-catalog-content\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.854486 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfgfg\" (UniqueName: \"kubernetes.io/projected/2db8e841-8dd9-4df0-ac17-47ce3673078c-kube-api-access-kfgfg\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.956145 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-catalog-content\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.956201 4991 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-kfgfg\" (UniqueName: \"kubernetes.io/projected/2db8e841-8dd9-4df0-ac17-47ce3673078c-kube-api-access-kfgfg\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.956249 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-utilities\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.956869 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-utilities\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.956883 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-catalog-content\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:04 crc kubenswrapper[4991]: I1206 01:16:04.985059 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfgfg\" (UniqueName: \"kubernetes.io/projected/2db8e841-8dd9-4df0-ac17-47ce3673078c-kube-api-access-kfgfg\") pod \"redhat-operators-j5w5j\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:05 crc kubenswrapper[4991]: I1206 01:16:05.072816 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:05 crc kubenswrapper[4991]: I1206 01:16:05.342721 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5w5j"] Dec 06 01:16:05 crc kubenswrapper[4991]: W1206 01:16:05.353293 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db8e841_8dd9_4df0_ac17_47ce3673078c.slice/crio-644ed1f07b8b5210d7ce822013ed3d95ecb40d844258713c445f88221ee2fe5e WatchSource:0}: Error finding container 644ed1f07b8b5210d7ce822013ed3d95ecb40d844258713c445f88221ee2fe5e: Status 404 returned error can't find the container with id 644ed1f07b8b5210d7ce822013ed3d95ecb40d844258713c445f88221ee2fe5e Dec 06 01:16:06 crc kubenswrapper[4991]: I1206 01:16:06.277195 4991 generic.go:334] "Generic (PLEG): container finished" podID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerID="4624be8d36d1c7f614c2be4ff3b9994ec9e53e77ad87bc680bb9adbc49166d37" exitCode=0 Dec 06 01:16:06 crc kubenswrapper[4991]: I1206 01:16:06.277432 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" event={"ID":"6f2005a0-682b-485a-bb40-b60cd9683c2e","Type":"ContainerDied","Data":"4624be8d36d1c7f614c2be4ff3b9994ec9e53e77ad87bc680bb9adbc49166d37"} Dec 06 01:16:06 crc kubenswrapper[4991]: I1206 01:16:06.280756 4991 generic.go:334] "Generic (PLEG): container finished" podID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerID="0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa" exitCode=0 Dec 06 01:16:06 crc kubenswrapper[4991]: I1206 01:16:06.280818 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerDied","Data":"0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa"} Dec 06 01:16:06 crc 
kubenswrapper[4991]: I1206 01:16:06.280856 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerStarted","Data":"644ed1f07b8b5210d7ce822013ed3d95ecb40d844258713c445f88221ee2fe5e"} Dec 06 01:16:07 crc kubenswrapper[4991]: I1206 01:16:07.290949 4991 generic.go:334] "Generic (PLEG): container finished" podID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerID="59526e5bc1cba76ed3c803dc84d871998f928f660fb00b0b63611f8daf0a26a9" exitCode=0 Dec 06 01:16:07 crc kubenswrapper[4991]: I1206 01:16:07.291058 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" event={"ID":"6f2005a0-682b-485a-bb40-b60cd9683c2e","Type":"ContainerDied","Data":"59526e5bc1cba76ed3c803dc84d871998f928f660fb00b0b63611f8daf0a26a9"} Dec 06 01:16:07 crc kubenswrapper[4991]: I1206 01:16:07.295281 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerStarted","Data":"da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3"} Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.306601 4991 generic.go:334] "Generic (PLEG): container finished" podID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerID="da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3" exitCode=0 Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.306685 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerDied","Data":"da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3"} Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.564873 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.712242 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-util\") pod \"6f2005a0-682b-485a-bb40-b60cd9683c2e\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.712452 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dnrt\" (UniqueName: \"kubernetes.io/projected/6f2005a0-682b-485a-bb40-b60cd9683c2e-kube-api-access-6dnrt\") pod \"6f2005a0-682b-485a-bb40-b60cd9683c2e\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.712500 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-bundle\") pod \"6f2005a0-682b-485a-bb40-b60cd9683c2e\" (UID: \"6f2005a0-682b-485a-bb40-b60cd9683c2e\") " Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.713693 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-bundle" (OuterVolumeSpecName: "bundle") pod "6f2005a0-682b-485a-bb40-b60cd9683c2e" (UID: "6f2005a0-682b-485a-bb40-b60cd9683c2e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.724281 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2005a0-682b-485a-bb40-b60cd9683c2e-kube-api-access-6dnrt" (OuterVolumeSpecName: "kube-api-access-6dnrt") pod "6f2005a0-682b-485a-bb40-b60cd9683c2e" (UID: "6f2005a0-682b-485a-bb40-b60cd9683c2e"). InnerVolumeSpecName "kube-api-access-6dnrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.814726 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dnrt\" (UniqueName: \"kubernetes.io/projected/6f2005a0-682b-485a-bb40-b60cd9683c2e-kube-api-access-6dnrt\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.814820 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.849662 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-util" (OuterVolumeSpecName: "util") pod "6f2005a0-682b-485a-bb40-b60cd9683c2e" (UID: "6f2005a0-682b-485a-bb40-b60cd9683c2e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:16:08 crc kubenswrapper[4991]: I1206 01:16:08.915766 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2005a0-682b-485a-bb40-b60cd9683c2e-util\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:09 crc kubenswrapper[4991]: I1206 01:16:09.331352 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerStarted","Data":"3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53"} Dec 06 01:16:09 crc kubenswrapper[4991]: I1206 01:16:09.337036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" event={"ID":"6f2005a0-682b-485a-bb40-b60cd9683c2e","Type":"ContainerDied","Data":"21842c3ab18ceeb19527df7e3953af48c8dabc85bbae9af5fd8dec5dbfcfad8a"} Dec 06 01:16:09 crc kubenswrapper[4991]: I1206 01:16:09.337092 4991 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21842c3ab18ceeb19527df7e3953af48c8dabc85bbae9af5fd8dec5dbfcfad8a" Dec 06 01:16:09 crc kubenswrapper[4991]: I1206 01:16:09.337128 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99" Dec 06 01:16:09 crc kubenswrapper[4991]: I1206 01:16:09.355307 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j5w5j" podStartSLOduration=2.657795745 podStartE2EDuration="5.355269542s" podCreationTimestamp="2025-12-06 01:16:04 +0000 UTC" firstStartedPulling="2025-12-06 01:16:06.282594588 +0000 UTC m=+839.632885056" lastFinishedPulling="2025-12-06 01:16:08.980068345 +0000 UTC m=+842.330358853" observedRunningTime="2025-12-06 01:16:09.353905746 +0000 UTC m=+842.704196214" watchObservedRunningTime="2025-12-06 01:16:09.355269542 +0000 UTC m=+842.705560040" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.668581 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl"] Dec 06 01:16:12 crc kubenswrapper[4991]: E1206 01:16:12.669397 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="pull" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.669411 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="pull" Dec 06 01:16:12 crc kubenswrapper[4991]: E1206 01:16:12.669424 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="extract" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.669430 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="extract" Dec 06 01:16:12 crc kubenswrapper[4991]: E1206 
01:16:12.669446 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="util" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.669454 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="util" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.669555 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2005a0-682b-485a-bb40-b60cd9683c2e" containerName="extract" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.670090 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.673235 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bkcjr" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.673395 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.674151 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.682938 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl"] Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.774416 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr7fr\" (UniqueName: \"kubernetes.io/projected/47befc6f-6b9a-4a39-a029-1d55f420cf3b-kube-api-access-mr7fr\") pod \"nmstate-operator-5b5b58f5c8-nl2cl\" (UID: \"47befc6f-6b9a-4a39-a029-1d55f420cf3b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.875461 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mr7fr\" (UniqueName: \"kubernetes.io/projected/47befc6f-6b9a-4a39-a029-1d55f420cf3b-kube-api-access-mr7fr\") pod \"nmstate-operator-5b5b58f5c8-nl2cl\" (UID: \"47befc6f-6b9a-4a39-a029-1d55f420cf3b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.903170 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr7fr\" (UniqueName: \"kubernetes.io/projected/47befc6f-6b9a-4a39-a029-1d55f420cf3b-kube-api-access-mr7fr\") pod \"nmstate-operator-5b5b58f5c8-nl2cl\" (UID: \"47befc6f-6b9a-4a39-a029-1d55f420cf3b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" Dec 06 01:16:12 crc kubenswrapper[4991]: I1206 01:16:12.987464 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" Dec 06 01:16:13 crc kubenswrapper[4991]: I1206 01:16:13.205129 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl"] Dec 06 01:16:13 crc kubenswrapper[4991]: W1206 01:16:13.210472 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47befc6f_6b9a_4a39_a029_1d55f420cf3b.slice/crio-de72e1395ddf7c84f86914053ead7adda93b16c42c62a61ab03fd9433193b0b5 WatchSource:0}: Error finding container de72e1395ddf7c84f86914053ead7adda93b16c42c62a61ab03fd9433193b0b5: Status 404 returned error can't find the container with id de72e1395ddf7c84f86914053ead7adda93b16c42c62a61ab03fd9433193b0b5 Dec 06 01:16:13 crc kubenswrapper[4991]: I1206 01:16:13.361864 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" event={"ID":"47befc6f-6b9a-4a39-a029-1d55f420cf3b","Type":"ContainerStarted","Data":"de72e1395ddf7c84f86914053ead7adda93b16c42c62a61ab03fd9433193b0b5"} Dec 06 01:16:15 crc kubenswrapper[4991]: I1206 
01:16:15.073753 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:15 crc kubenswrapper[4991]: I1206 01:16:15.074340 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.182404 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j5w5j" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="registry-server" probeResult="failure" output=< Dec 06 01:16:16 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:16:16 crc kubenswrapper[4991]: > Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.381706 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" event={"ID":"47befc6f-6b9a-4a39-a029-1d55f420cf3b","Type":"ContainerStarted","Data":"2b5a1dc516d1b368e24b317cd75b67b1ab843051fc36e5ea276cad04d93c1c35"} Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.413421 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nl2cl" podStartSLOduration=1.7758291000000002 podStartE2EDuration="4.413401013s" podCreationTimestamp="2025-12-06 01:16:12 +0000 UTC" firstStartedPulling="2025-12-06 01:16:13.214803857 +0000 UTC m=+846.565094325" lastFinishedPulling="2025-12-06 01:16:15.85237577 +0000 UTC m=+849.202666238" observedRunningTime="2025-12-06 01:16:16.410752322 +0000 UTC m=+849.761042820" watchObservedRunningTime="2025-12-06 01:16:16.413401013 +0000 UTC m=+849.763691491" Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.739573 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.739702 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.739789 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.741027 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42613daf70f47f0e109690b595c837fa9976654bde53b0036709cbe0f06846cc"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:16:16 crc kubenswrapper[4991]: I1206 01:16:16.741166 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://42613daf70f47f0e109690b595c837fa9976654bde53b0036709cbe0f06846cc" gracePeriod=600 Dec 06 01:16:17 crc kubenswrapper[4991]: I1206 01:16:17.391491 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="42613daf70f47f0e109690b595c837fa9976654bde53b0036709cbe0f06846cc" exitCode=0 Dec 06 01:16:17 crc kubenswrapper[4991]: I1206 01:16:17.391564 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"42613daf70f47f0e109690b595c837fa9976654bde53b0036709cbe0f06846cc"} Dec 06 01:16:17 crc kubenswrapper[4991]: I1206 01:16:17.392088 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"f375611be938212e8f0efe7042b61ab4ee937e28fa539d018afc5fdc1b82d418"} Dec 06 01:16:17 crc kubenswrapper[4991]: I1206 01:16:17.392123 4991 scope.go:117] "RemoveContainer" containerID="170286243e72a237ea08578f06e304cbd6d9204823fba253174d3c0df2ac7b5e" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.551123 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.552456 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.555613 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pgdt6" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.574725 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.581896 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.599403 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6vvck"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.601202 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.601748 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.608718 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.621338 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.658241 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/461dd677-1efc-468e-833f-125c37a2c7b8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.658299 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzkx8\" (UniqueName: \"kubernetes.io/projected/461dd677-1efc-468e-833f-125c37a2c7b8-kube-api-access-bzkx8\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.658498 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6d7r\" (UniqueName: \"kubernetes.io/projected/8ba61c7d-dc05-4834-9e0e-e18bb755d1a9-kube-api-access-l6d7r\") pod \"nmstate-metrics-7f946cbc9-9czh7\" (UID: \"8ba61c7d-dc05-4834-9e0e-e18bb755d1a9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.717317 4991 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.718277 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.723882 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.724104 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lb2ss" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.724123 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.728886 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760049 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/461dd677-1efc-468e-833f-125c37a2c7b8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760211 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzkx8\" (UniqueName: \"kubernetes.io/projected/461dd677-1efc-468e-833f-125c37a2c7b8-kube-api-access-bzkx8\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760248 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" 
(UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-ovs-socket\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760275 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rvj\" (UniqueName: \"kubernetes.io/projected/ac4eae73-f13b-4e52-9b25-f837bba897b0-kube-api-access-z5rvj\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760295 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-nmstate-lock\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760325 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-dbus-socket\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.760365 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6d7r\" (UniqueName: \"kubernetes.io/projected/8ba61c7d-dc05-4834-9e0e-e18bb755d1a9-kube-api-access-l6d7r\") pod \"nmstate-metrics-7f946cbc9-9czh7\" (UID: \"8ba61c7d-dc05-4834-9e0e-e18bb755d1a9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" Dec 06 01:16:22 crc kubenswrapper[4991]: E1206 01:16:22.760385 4991 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret 
"openshift-nmstate-webhook" not found Dec 06 01:16:22 crc kubenswrapper[4991]: E1206 01:16:22.760782 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461dd677-1efc-468e-833f-125c37a2c7b8-tls-key-pair podName:461dd677-1efc-468e-833f-125c37a2c7b8 nodeName:}" failed. No retries permitted until 2025-12-06 01:16:23.260756855 +0000 UTC m=+856.611047323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/461dd677-1efc-468e-833f-125c37a2c7b8-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-q6ftt" (UID: "461dd677-1efc-468e-833f-125c37a2c7b8") : secret "openshift-nmstate-webhook" not found Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.780572 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzkx8\" (UniqueName: \"kubernetes.io/projected/461dd677-1efc-468e-833f-125c37a2c7b8-kube-api-access-bzkx8\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.780592 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6d7r\" (UniqueName: \"kubernetes.io/projected/8ba61c7d-dc05-4834-9e0e-e18bb755d1a9-kube-api-access-l6d7r\") pod \"nmstate-metrics-7f946cbc9-9czh7\" (UID: \"8ba61c7d-dc05-4834-9e0e-e18bb755d1a9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.862439 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4gdl\" (UniqueName: \"kubernetes.io/projected/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-kube-api-access-g4gdl\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc 
kubenswrapper[4991]: I1206 01:16:22.862520 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.862586 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-ovs-socket\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.862618 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rvj\" (UniqueName: \"kubernetes.io/projected/ac4eae73-f13b-4e52-9b25-f837bba897b0-kube-api-access-z5rvj\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.862643 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-nmstate-lock\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.862680 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-dbus-socket\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 
01:16:22.862747 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.862865 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-ovs-socket\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.863165 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-nmstate-lock\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.863467 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ac4eae73-f13b-4e52-9b25-f837bba897b0-dbus-socket\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.870572 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.893908 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rvj\" (UniqueName: \"kubernetes.io/projected/ac4eae73-f13b-4e52-9b25-f837bba897b0-kube-api-access-z5rvj\") pod \"nmstate-handler-6vvck\" (UID: \"ac4eae73-f13b-4e52-9b25-f837bba897b0\") " pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.939176 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.941075 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-56dccc5ff8-s7k5z"] Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.941699 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.963950 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-serving-cert\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964544 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964569 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-oauth-config\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964596 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4gdl\" (UniqueName: \"kubernetes.io/projected/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-kube-api-access-g4gdl\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964621 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964676 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-oauth-serving-cert\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964699 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzs7v\" (UniqueName: \"kubernetes.io/projected/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-kube-api-access-zzs7v\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: 
I1206 01:16:22.964726 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-config\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964756 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-service-ca\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.964777 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-trusted-ca-bundle\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.966238 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.976374 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.983900 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4gdl\" (UniqueName: \"kubernetes.io/projected/7b08a3d0-cce4-42df-b89a-a9b11ef39a0d-kube-api-access-g4gdl\") pod \"nmstate-console-plugin-7fbb5f6569-gm4xg\" (UID: \"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:22 crc kubenswrapper[4991]: I1206 01:16:22.999140 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56dccc5ff8-s7k5z"] Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.039235 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.066330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-oauth-serving-cert\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067315 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzs7v\" (UniqueName: \"kubernetes.io/projected/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-kube-api-access-zzs7v\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067346 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-config\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " 
pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-service-ca\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067398 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-trusted-ca-bundle\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067421 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-serving-cert\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067440 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-oauth-config\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.068480 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-config\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " 
pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.067281 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-oauth-serving-cert\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.068740 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-trusted-ca-bundle\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.069006 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-service-ca\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.081738 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-serving-cert\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.082819 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-console-oauth-config\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc 
kubenswrapper[4991]: I1206 01:16:23.086742 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzs7v\" (UniqueName: \"kubernetes.io/projected/f7c902be-9d7b-4943-9e5a-3eae2ce9e16c-kube-api-access-zzs7v\") pod \"console-56dccc5ff8-s7k5z\" (UID: \"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c\") " pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.111961 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7"] Dec 06 01:16:23 crc kubenswrapper[4991]: W1206 01:16:23.117028 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba61c7d_dc05_4834_9e0e_e18bb755d1a9.slice/crio-d906924cd4ddba14be4851f1c9ad48604aacab33e35f62be824a134bb4863933 WatchSource:0}: Error finding container d906924cd4ddba14be4851f1c9ad48604aacab33e35f62be824a134bb4863933: Status 404 returned error can't find the container with id d906924cd4ddba14be4851f1c9ad48604aacab33e35f62be824a134bb4863933 Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.265891 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.270898 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/461dd677-1efc-468e-833f-125c37a2c7b8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.275368 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/461dd677-1efc-468e-833f-125c37a2c7b8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-q6ftt\" (UID: \"461dd677-1efc-468e-833f-125c37a2c7b8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.441892 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" event={"ID":"8ba61c7d-dc05-4834-9e0e-e18bb755d1a9","Type":"ContainerStarted","Data":"d906924cd4ddba14be4851f1c9ad48604aacab33e35f62be824a134bb4863933"} Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.443726 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6vvck" event={"ID":"ac4eae73-f13b-4e52-9b25-f837bba897b0","Type":"ContainerStarted","Data":"8a791b63444dce386071bf9c954b701762b30f662701648918bb55486fc74016"} Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.460857 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg"] Dec 06 01:16:23 crc kubenswrapper[4991]: W1206 01:16:23.461882 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b08a3d0_cce4_42df_b89a_a9b11ef39a0d.slice/crio-fc9923a6268053b6d6afdc6df199b9d0573c1ad02a2717793d19dc6b03ce6d14 
WatchSource:0}: Error finding container fc9923a6268053b6d6afdc6df199b9d0573c1ad02a2717793d19dc6b03ce6d14: Status 404 returned error can't find the container with id fc9923a6268053b6d6afdc6df199b9d0573c1ad02a2717793d19dc6b03ce6d14 Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.555606 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.777024 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56dccc5ff8-s7k5z"] Dec 06 01:16:23 crc kubenswrapper[4991]: I1206 01:16:23.804041 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt"] Dec 06 01:16:23 crc kubenswrapper[4991]: W1206 01:16:23.805817 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c902be_9d7b_4943_9e5a_3eae2ce9e16c.slice/crio-81c23aba43bad14b9491967ab2935431560b8196fc0176bd4d57e0c954c05631 WatchSource:0}: Error finding container 81c23aba43bad14b9491967ab2935431560b8196fc0176bd4d57e0c954c05631: Status 404 returned error can't find the container with id 81c23aba43bad14b9491967ab2935431560b8196fc0176bd4d57e0c954c05631 Dec 06 01:16:24 crc kubenswrapper[4991]: I1206 01:16:24.460702 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" event={"ID":"461dd677-1efc-468e-833f-125c37a2c7b8","Type":"ContainerStarted","Data":"abc53b60340147ef22e8865e7c324d05e21768df26caf9140d0ad84d419407c4"} Dec 06 01:16:24 crc kubenswrapper[4991]: I1206 01:16:24.462598 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" event={"ID":"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d","Type":"ContainerStarted","Data":"fc9923a6268053b6d6afdc6df199b9d0573c1ad02a2717793d19dc6b03ce6d14"} Dec 06 01:16:24 crc 
kubenswrapper[4991]: I1206 01:16:24.466034 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dccc5ff8-s7k5z" event={"ID":"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c","Type":"ContainerStarted","Data":"1838ab94f35ba6d533ebbcdd3d2c1421dddbfb32c682192dbebda6de7584a911"} Dec 06 01:16:24 crc kubenswrapper[4991]: I1206 01:16:24.466175 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dccc5ff8-s7k5z" event={"ID":"f7c902be-9d7b-4943-9e5a-3eae2ce9e16c","Type":"ContainerStarted","Data":"81c23aba43bad14b9491967ab2935431560b8196fc0176bd4d57e0c954c05631"} Dec 06 01:16:24 crc kubenswrapper[4991]: I1206 01:16:24.490707 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56dccc5ff8-s7k5z" podStartSLOduration=2.490680245 podStartE2EDuration="2.490680245s" podCreationTimestamp="2025-12-06 01:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:16:24.488755733 +0000 UTC m=+857.839046221" watchObservedRunningTime="2025-12-06 01:16:24.490680245 +0000 UTC m=+857.840970763" Dec 06 01:16:25 crc kubenswrapper[4991]: I1206 01:16:25.118175 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:25 crc kubenswrapper[4991]: I1206 01:16:25.165903 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:25 crc kubenswrapper[4991]: I1206 01:16:25.356119 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5w5j"] Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.478395 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6vvck" 
event={"ID":"ac4eae73-f13b-4e52-9b25-f837bba897b0","Type":"ContainerStarted","Data":"bd6d1e429c33cd034be90920c8eeff0f899874427d03c19daa203b2611118cdb"} Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.480721 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.481317 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" event={"ID":"461dd677-1efc-468e-833f-125c37a2c7b8","Type":"ContainerStarted","Data":"97661bec1bc66a4c2b48287f5a3772a77581b649890068cba959c38b55b7cdcb"} Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.481555 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.482937 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" event={"ID":"7b08a3d0-cce4-42df-b89a-a9b11ef39a0d","Type":"ContainerStarted","Data":"5631f3070503600521045cf62b11af3056a3e1623914586774780b3ef751fd5a"} Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.484806 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j5w5j" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="registry-server" containerID="cri-o://3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53" gracePeriod=2 Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.485466 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" event={"ID":"8ba61c7d-dc05-4834-9e0e-e18bb755d1a9","Type":"ContainerStarted","Data":"4be861b9f3642842301178bbbbfd811c32279d3152376ae66194264689b8fc4c"} Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.511487 4991 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-handler-6vvck" podStartSLOduration=1.512046036 podStartE2EDuration="4.511457376s" podCreationTimestamp="2025-12-06 01:16:22 +0000 UTC" firstStartedPulling="2025-12-06 01:16:23.009822032 +0000 UTC m=+856.360112500" lastFinishedPulling="2025-12-06 01:16:26.009233332 +0000 UTC m=+859.359523840" observedRunningTime="2025-12-06 01:16:26.499256341 +0000 UTC m=+859.849546809" watchObservedRunningTime="2025-12-06 01:16:26.511457376 +0000 UTC m=+859.861747844" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.521031 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gm4xg" podStartSLOduration=1.975707491 podStartE2EDuration="4.521008621s" podCreationTimestamp="2025-12-06 01:16:22 +0000 UTC" firstStartedPulling="2025-12-06 01:16:23.463502071 +0000 UTC m=+856.813792539" lastFinishedPulling="2025-12-06 01:16:26.008803201 +0000 UTC m=+859.359093669" observedRunningTime="2025-12-06 01:16:26.520156368 +0000 UTC m=+859.870446836" watchObservedRunningTime="2025-12-06 01:16:26.521008621 +0000 UTC m=+859.871299109" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.547582 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" podStartSLOduration=2.312057067 podStartE2EDuration="4.547547959s" podCreationTimestamp="2025-12-06 01:16:22 +0000 UTC" firstStartedPulling="2025-12-06 01:16:23.80595662 +0000 UTC m=+857.156247118" lastFinishedPulling="2025-12-06 01:16:26.041447542 +0000 UTC m=+859.391738010" observedRunningTime="2025-12-06 01:16:26.547315633 +0000 UTC m=+859.897606141" watchObservedRunningTime="2025-12-06 01:16:26.547547959 +0000 UTC m=+859.897838427" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.819331 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.829021 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfgfg\" (UniqueName: \"kubernetes.io/projected/2db8e841-8dd9-4df0-ac17-47ce3673078c-kube-api-access-kfgfg\") pod \"2db8e841-8dd9-4df0-ac17-47ce3673078c\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.829059 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-catalog-content\") pod \"2db8e841-8dd9-4df0-ac17-47ce3673078c\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.829081 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-utilities\") pod \"2db8e841-8dd9-4df0-ac17-47ce3673078c\" (UID: \"2db8e841-8dd9-4df0-ac17-47ce3673078c\") " Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.830936 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-utilities" (OuterVolumeSpecName: "utilities") pod "2db8e841-8dd9-4df0-ac17-47ce3673078c" (UID: "2db8e841-8dd9-4df0-ac17-47ce3673078c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.843478 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db8e841-8dd9-4df0-ac17-47ce3673078c-kube-api-access-kfgfg" (OuterVolumeSpecName: "kube-api-access-kfgfg") pod "2db8e841-8dd9-4df0-ac17-47ce3673078c" (UID: "2db8e841-8dd9-4df0-ac17-47ce3673078c"). InnerVolumeSpecName "kube-api-access-kfgfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.930281 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfgfg\" (UniqueName: \"kubernetes.io/projected/2db8e841-8dd9-4df0-ac17-47ce3673078c-kube-api-access-kfgfg\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.930325 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:26 crc kubenswrapper[4991]: I1206 01:16:26.963364 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db8e841-8dd9-4df0-ac17-47ce3673078c" (UID: "2db8e841-8dd9-4df0-ac17-47ce3673078c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.031398 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db8e841-8dd9-4df0-ac17-47ce3673078c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.495689 4991 generic.go:334] "Generic (PLEG): container finished" podID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerID="3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53" exitCode=0 Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.495762 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerDied","Data":"3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53"} Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.496268 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-j5w5j" event={"ID":"2db8e841-8dd9-4df0-ac17-47ce3673078c","Type":"ContainerDied","Data":"644ed1f07b8b5210d7ce822013ed3d95ecb40d844258713c445f88221ee2fe5e"} Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.496313 4991 scope.go:117] "RemoveContainer" containerID="3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.495881 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5w5j" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.525710 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5w5j"] Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.530952 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j5w5j"] Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.536746 4991 scope.go:117] "RemoveContainer" containerID="da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.566354 4991 scope.go:117] "RemoveContainer" containerID="0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.600386 4991 scope.go:117] "RemoveContainer" containerID="3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53" Dec 06 01:16:27 crc kubenswrapper[4991]: E1206 01:16:27.600918 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53\": container with ID starting with 3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53 not found: ID does not exist" containerID="3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.600961 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53"} err="failed to get container status \"3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53\": rpc error: code = NotFound desc = could not find container \"3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53\": container with ID starting with 3e1d71f11f291e9868c301242664d32c091827f744fa9941794a004066099b53 not found: ID does not exist" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.601011 4991 scope.go:117] "RemoveContainer" containerID="da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3" Dec 06 01:16:27 crc kubenswrapper[4991]: E1206 01:16:27.602273 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3\": container with ID starting with da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3 not found: ID does not exist" containerID="da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.602328 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3"} err="failed to get container status \"da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3\": rpc error: code = NotFound desc = could not find container \"da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3\": container with ID starting with da4a38bf74904ed31a3928fdbcafc77693aa5055f629f3704e32f5409eb588f3 not found: ID does not exist" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.602362 4991 scope.go:117] "RemoveContainer" containerID="0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa" Dec 06 01:16:27 crc kubenswrapper[4991]: E1206 
01:16:27.603968 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa\": container with ID starting with 0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa not found: ID does not exist" containerID="0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa" Dec 06 01:16:27 crc kubenswrapper[4991]: I1206 01:16:27.604018 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa"} err="failed to get container status \"0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa\": rpc error: code = NotFound desc = could not find container \"0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa\": container with ID starting with 0a879a06a5d4d266e285c92b3114e64274a3217d4111a493cf08454f1ae65caa not found: ID does not exist" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.049808 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" path="/var/lib/kubelet/pods/2db8e841-8dd9-4df0-ac17-47ce3673078c/volumes" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.523367 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" event={"ID":"8ba61c7d-dc05-4834-9e0e-e18bb755d1a9","Type":"ContainerStarted","Data":"79b1bebd4fbb46f1474f8d306635f477432ca49101a817a252c4aa550b14ed8a"} Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.554149 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-9czh7" podStartSLOduration=2.189417704 podStartE2EDuration="7.554126191s" podCreationTimestamp="2025-12-06 01:16:22 +0000 UTC" firstStartedPulling="2025-12-06 01:16:23.119161071 +0000 UTC m=+856.469451539" 
lastFinishedPulling="2025-12-06 01:16:28.483869518 +0000 UTC m=+861.834160026" observedRunningTime="2025-12-06 01:16:29.551509101 +0000 UTC m=+862.901799579" watchObservedRunningTime="2025-12-06 01:16:29.554126191 +0000 UTC m=+862.904416669" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.567056 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dfb6p"] Dec 06 01:16:29 crc kubenswrapper[4991]: E1206 01:16:29.567410 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="registry-server" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.567439 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="registry-server" Dec 06 01:16:29 crc kubenswrapper[4991]: E1206 01:16:29.567468 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="extract-content" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.567482 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="extract-content" Dec 06 01:16:29 crc kubenswrapper[4991]: E1206 01:16:29.567504 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="extract-utilities" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.567517 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="extract-utilities" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.567748 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db8e841-8dd9-4df0-ac17-47ce3673078c" containerName="registry-server" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.569377 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.585235 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfb6p"] Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.767182 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-catalog-content\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.767267 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-utilities\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.767308 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7sg\" (UniqueName: \"kubernetes.io/projected/a67fb8d0-4d03-4181-ac71-1995faf47df6-kube-api-access-np7sg\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.868973 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-catalog-content\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.869394 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-utilities\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.869440 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7sg\" (UniqueName: \"kubernetes.io/projected/a67fb8d0-4d03-4181-ac71-1995faf47df6-kube-api-access-np7sg\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.869950 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-catalog-content\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.870042 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-utilities\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.892799 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7sg\" (UniqueName: \"kubernetes.io/projected/a67fb8d0-4d03-4181-ac71-1995faf47df6-kube-api-access-np7sg\") pod \"redhat-marketplace-dfb6p\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:29 crc kubenswrapper[4991]: I1206 01:16:29.901480 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:30 crc kubenswrapper[4991]: I1206 01:16:30.117728 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfb6p"] Dec 06 01:16:30 crc kubenswrapper[4991]: W1206 01:16:30.123777 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67fb8d0_4d03_4181_ac71_1995faf47df6.slice/crio-7266c7a21395020bd9e6796315dae31d178eeabbca56de7ece93496e03bf7e09 WatchSource:0}: Error finding container 7266c7a21395020bd9e6796315dae31d178eeabbca56de7ece93496e03bf7e09: Status 404 returned error can't find the container with id 7266c7a21395020bd9e6796315dae31d178eeabbca56de7ece93496e03bf7e09 Dec 06 01:16:30 crc kubenswrapper[4991]: I1206 01:16:30.532342 4991 generic.go:334] "Generic (PLEG): container finished" podID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerID="e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef" exitCode=0 Dec 06 01:16:30 crc kubenswrapper[4991]: I1206 01:16:30.532442 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfb6p" event={"ID":"a67fb8d0-4d03-4181-ac71-1995faf47df6","Type":"ContainerDied","Data":"e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef"} Dec 06 01:16:30 crc kubenswrapper[4991]: I1206 01:16:30.532966 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfb6p" event={"ID":"a67fb8d0-4d03-4181-ac71-1995faf47df6","Type":"ContainerStarted","Data":"7266c7a21395020bd9e6796315dae31d178eeabbca56de7ece93496e03bf7e09"} Dec 06 01:16:31 crc kubenswrapper[4991]: I1206 01:16:31.544623 4991 generic.go:334] "Generic (PLEG): container finished" podID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerID="48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead" exitCode=0 Dec 06 01:16:31 crc kubenswrapper[4991]: I1206 
01:16:31.544745 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfb6p" event={"ID":"a67fb8d0-4d03-4181-ac71-1995faf47df6","Type":"ContainerDied","Data":"48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead"} Dec 06 01:16:32 crc kubenswrapper[4991]: I1206 01:16:32.554594 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfb6p" event={"ID":"a67fb8d0-4d03-4181-ac71-1995faf47df6","Type":"ContainerStarted","Data":"4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c"} Dec 06 01:16:32 crc kubenswrapper[4991]: I1206 01:16:32.592131 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dfb6p" podStartSLOduration=2.186458825 podStartE2EDuration="3.59210446s" podCreationTimestamp="2025-12-06 01:16:29 +0000 UTC" firstStartedPulling="2025-12-06 01:16:30.53392356 +0000 UTC m=+863.884214018" lastFinishedPulling="2025-12-06 01:16:31.939569175 +0000 UTC m=+865.289859653" observedRunningTime="2025-12-06 01:16:32.58235124 +0000 UTC m=+865.932641728" watchObservedRunningTime="2025-12-06 01:16:32.59210446 +0000 UTC m=+865.942394948" Dec 06 01:16:32 crc kubenswrapper[4991]: I1206 01:16:32.982628 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6vvck" Dec 06 01:16:33 crc kubenswrapper[4991]: I1206 01:16:33.266655 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:33 crc kubenswrapper[4991]: I1206 01:16:33.266796 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:33 crc kubenswrapper[4991]: I1206 01:16:33.274299 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:33 crc 
kubenswrapper[4991]: I1206 01:16:33.568144 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56dccc5ff8-s7k5z" Dec 06 01:16:33 crc kubenswrapper[4991]: I1206 01:16:33.653951 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-97gxp"] Dec 06 01:16:39 crc kubenswrapper[4991]: I1206 01:16:39.902044 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:39 crc kubenswrapper[4991]: I1206 01:16:39.902704 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:39 crc kubenswrapper[4991]: I1206 01:16:39.971724 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:40 crc kubenswrapper[4991]: I1206 01:16:40.687446 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:40 crc kubenswrapper[4991]: I1206 01:16:40.756201 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfb6p"] Dec 06 01:16:42 crc kubenswrapper[4991]: I1206 01:16:42.630737 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dfb6p" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="registry-server" containerID="cri-o://4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c" gracePeriod=2 Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.551057 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.562653 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-q6ftt" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.641643 4991 generic.go:334] "Generic (PLEG): container finished" podID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerID="4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c" exitCode=0 Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.642034 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfb6p" event={"ID":"a67fb8d0-4d03-4181-ac71-1995faf47df6","Type":"ContainerDied","Data":"4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c"} Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.642106 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfb6p" event={"ID":"a67fb8d0-4d03-4181-ac71-1995faf47df6","Type":"ContainerDied","Data":"7266c7a21395020bd9e6796315dae31d178eeabbca56de7ece93496e03bf7e09"} Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.642134 4991 scope.go:117] "RemoveContainer" containerID="4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.642142 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfb6p" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.659357 4991 scope.go:117] "RemoveContainer" containerID="48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.674515 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-utilities\") pod \"a67fb8d0-4d03-4181-ac71-1995faf47df6\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.674672 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7sg\" (UniqueName: \"kubernetes.io/projected/a67fb8d0-4d03-4181-ac71-1995faf47df6-kube-api-access-np7sg\") pod \"a67fb8d0-4d03-4181-ac71-1995faf47df6\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.674748 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-catalog-content\") pod \"a67fb8d0-4d03-4181-ac71-1995faf47df6\" (UID: \"a67fb8d0-4d03-4181-ac71-1995faf47df6\") " Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.676238 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-utilities" (OuterVolumeSpecName: "utilities") pod "a67fb8d0-4d03-4181-ac71-1995faf47df6" (UID: "a67fb8d0-4d03-4181-ac71-1995faf47df6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.682193 4991 scope.go:117] "RemoveContainer" containerID="e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.682452 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67fb8d0-4d03-4181-ac71-1995faf47df6-kube-api-access-np7sg" (OuterVolumeSpecName: "kube-api-access-np7sg") pod "a67fb8d0-4d03-4181-ac71-1995faf47df6" (UID: "a67fb8d0-4d03-4181-ac71-1995faf47df6"). InnerVolumeSpecName "kube-api-access-np7sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.712842 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a67fb8d0-4d03-4181-ac71-1995faf47df6" (UID: "a67fb8d0-4d03-4181-ac71-1995faf47df6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.718201 4991 scope.go:117] "RemoveContainer" containerID="4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c" Dec 06 01:16:43 crc kubenswrapper[4991]: E1206 01:16:43.718871 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c\": container with ID starting with 4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c not found: ID does not exist" containerID="4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.718909 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c"} err="failed to get container status \"4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c\": rpc error: code = NotFound desc = could not find container \"4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c\": container with ID starting with 4feee3af378d39e428fb6d189cd651dbdfd46cd3ae979115de440f0819ff377c not found: ID does not exist" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.718937 4991 scope.go:117] "RemoveContainer" containerID="48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead" Dec 06 01:16:43 crc kubenswrapper[4991]: E1206 01:16:43.719277 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead\": container with ID starting with 48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead not found: ID does not exist" containerID="48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.719383 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead"} err="failed to get container status \"48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead\": rpc error: code = NotFound desc = could not find container \"48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead\": container with ID starting with 48bc968700e618fb9d44b93403a2a513047bb845d902c69c81d21bd1702b9ead not found: ID does not exist" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.719471 4991 scope.go:117] "RemoveContainer" containerID="e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef" Dec 06 01:16:43 crc kubenswrapper[4991]: E1206 01:16:43.720058 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef\": container with ID starting with e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef not found: ID does not exist" containerID="e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.720186 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef"} err="failed to get container status \"e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef\": rpc error: code = NotFound desc = could not find container \"e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef\": container with ID starting with e01c599636e3a7ceeb3b8467835015cd0e35a6838a911dc0037727cd310aa9ef not found: ID does not exist" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.775878 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7sg\" (UniqueName: 
\"kubernetes.io/projected/a67fb8d0-4d03-4181-ac71-1995faf47df6-kube-api-access-np7sg\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.775923 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.775939 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a67fb8d0-4d03-4181-ac71-1995faf47df6-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.979914 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfb6p"] Dec 06 01:16:43 crc kubenswrapper[4991]: I1206 01:16:43.988736 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfb6p"] Dec 06 01:16:45 crc kubenswrapper[4991]: I1206 01:16:45.052914 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" path="/var/lib/kubelet/pods/a67fb8d0-4d03-4181-ac71-1995faf47df6/volumes" Dec 06 01:16:58 crc kubenswrapper[4991]: I1206 01:16:58.695064 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-97gxp" podUID="8723600d-fcf8-414f-a440-da9d65d41e07" containerName="console" containerID="cri-o://4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5" gracePeriod=15 Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.098513 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8nnxx"] Dec 06 01:16:59 crc kubenswrapper[4991]: E1206 01:16:59.098930 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="registry-server" Dec 06 01:16:59 crc 
kubenswrapper[4991]: I1206 01:16:59.098955 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="registry-server" Dec 06 01:16:59 crc kubenswrapper[4991]: E1206 01:16:59.099018 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="extract-content" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.100068 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="extract-content" Dec 06 01:16:59 crc kubenswrapper[4991]: E1206 01:16:59.100105 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="extract-utilities" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.100121 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="extract-utilities" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.100324 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67fb8d0-4d03-4181-ac71-1995faf47df6" containerName="registry-server" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.102451 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.111640 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nnxx"] Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.203791 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-97gxp_8723600d-fcf8-414f-a440-da9d65d41e07/console/0.log" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.203851 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.213238 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-catalog-content\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.213298 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr7k9\" (UniqueName: \"kubernetes.io/projected/b475cb0a-14ff-4a19-983a-c4a352134ffb-kube-api-access-zr7k9\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.213336 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-utilities\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.313834 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-serving-cert\") pod \"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.313893 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x5kd\" (UniqueName: \"kubernetes.io/projected/8723600d-fcf8-414f-a440-da9d65d41e07-kube-api-access-4x5kd\") pod 
\"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.313914 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-service-ca\") pod \"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.313942 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-trusted-ca-bundle\") pod \"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.313998 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-oauth-config\") pod \"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.314040 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-console-config\") pod \"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.314088 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-oauth-serving-cert\") pod \"8723600d-fcf8-414f-a440-da9d65d41e07\" (UID: \"8723600d-fcf8-414f-a440-da9d65d41e07\") " Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.314244 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zr7k9\" (UniqueName: \"kubernetes.io/projected/b475cb0a-14ff-4a19-983a-c4a352134ffb-kube-api-access-zr7k9\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.314284 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-utilities\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.314324 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-catalog-content\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.314753 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-catalog-content\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.315637 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-console-config" (OuterVolumeSpecName: "console-config") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.316157 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-service-ca" (OuterVolumeSpecName: "service-ca") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.316586 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.317457 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-utilities\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.317822 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.319731 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8723600d-fcf8-414f-a440-da9d65d41e07-kube-api-access-4x5kd" (OuterVolumeSpecName: "kube-api-access-4x5kd") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "kube-api-access-4x5kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.320633 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.321214 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8723600d-fcf8-414f-a440-da9d65d41e07" (UID: "8723600d-fcf8-414f-a440-da9d65d41e07"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.357417 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr7k9\" (UniqueName: \"kubernetes.io/projected/b475cb0a-14ff-4a19-983a-c4a352134ffb-kube-api-access-zr7k9\") pod \"community-operators-8nnxx\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415346 4991 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-console-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415376 4991 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415385 4991 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415396 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x5kd\" (UniqueName: \"kubernetes.io/projected/8723600d-fcf8-414f-a440-da9d65d41e07-kube-api-access-4x5kd\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415407 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-service-ca\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415416 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8723600d-fcf8-414f-a440-da9d65d41e07-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.415425 4991 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8723600d-fcf8-414f-a440-da9d65d41e07-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.427207 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.761472 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-97gxp_8723600d-fcf8-414f-a440-da9d65d41e07/console/0.log" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.761767 4991 generic.go:334] "Generic (PLEG): container finished" podID="8723600d-fcf8-414f-a440-da9d65d41e07" containerID="4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5" exitCode=2 Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.761791 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97gxp" event={"ID":"8723600d-fcf8-414f-a440-da9d65d41e07","Type":"ContainerDied","Data":"4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5"} Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.761815 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-97gxp" event={"ID":"8723600d-fcf8-414f-a440-da9d65d41e07","Type":"ContainerDied","Data":"320f8a90cc1d41152a902cb8345543ed323a0bad17247f6c31788749c52f7795"} Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.761832 4991 scope.go:117] "RemoveContainer" containerID="4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.761839 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-97gxp" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.782216 4991 scope.go:117] "RemoveContainer" containerID="4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5" Dec 06 01:16:59 crc kubenswrapper[4991]: E1206 01:16:59.783286 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5\": container with ID starting with 4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5 not found: ID does not exist" containerID="4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.783337 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5"} err="failed to get container status \"4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5\": rpc error: code = NotFound desc = could not find container \"4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5\": container with ID starting with 4b9f1c62629240e98ebb8a24fe30ae5c2f69731f649b4d02200779344c8772a5 not found: ID does not exist" Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.797897 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-97gxp"] Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.802800 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-97gxp"] Dec 06 01:16:59 crc kubenswrapper[4991]: I1206 01:16:59.889327 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nnxx"] Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.321317 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx"] Dec 06 01:17:00 crc kubenswrapper[4991]: E1206 01:17:00.321570 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8723600d-fcf8-414f-a440-da9d65d41e07" containerName="console" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.321584 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8723600d-fcf8-414f-a440-da9d65d41e07" containerName="console" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.321703 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8723600d-fcf8-414f-a440-da9d65d41e07" containerName="console" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.322582 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.325631 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.349841 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx"] Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.429683 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.429767 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsjvw\" (UniqueName: 
\"kubernetes.io/projected/c4cc76d8-965d-4228-8e93-4d9e736a9f95-kube-api-access-dsjvw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.430111 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.531639 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.531761 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsjvw\" (UniqueName: \"kubernetes.io/projected/c4cc76d8-965d-4228-8e93-4d9e736a9f95-kube-api-access-dsjvw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.531858 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-util\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.532350 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.532429 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.563090 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsjvw\" (UniqueName: \"kubernetes.io/projected/c4cc76d8-965d-4228-8e93-4d9e736a9f95-kube-api-access-dsjvw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.709724 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.774475 4991 generic.go:334] "Generic (PLEG): container finished" podID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerID="560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550" exitCode=0 Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.774577 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerDied","Data":"560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550"} Dec 06 01:17:00 crc kubenswrapper[4991]: I1206 01:17:00.774627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerStarted","Data":"a00e465744ac583405d4ddb759ac0a1207641d5c04af3a06c0898d7333aab6ed"} Dec 06 01:17:01 crc kubenswrapper[4991]: I1206 01:17:01.052374 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8723600d-fcf8-414f-a440-da9d65d41e07" path="/var/lib/kubelet/pods/8723600d-fcf8-414f-a440-da9d65d41e07/volumes" Dec 06 01:17:01 crc kubenswrapper[4991]: I1206 01:17:01.210594 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx"] Dec 06 01:17:01 crc kubenswrapper[4991]: I1206 01:17:01.786892 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerStarted","Data":"5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb"} Dec 06 01:17:01 crc kubenswrapper[4991]: I1206 01:17:01.789655 4991 generic.go:334] "Generic (PLEG): container finished" podID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" 
containerID="409fa70aa6e5a3e9c665a9160e656f7ba41b96dccc78e127d5ded9dcc5d0b3e4" exitCode=0 Dec 06 01:17:01 crc kubenswrapper[4991]: I1206 01:17:01.789722 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" event={"ID":"c4cc76d8-965d-4228-8e93-4d9e736a9f95","Type":"ContainerDied","Data":"409fa70aa6e5a3e9c665a9160e656f7ba41b96dccc78e127d5ded9dcc5d0b3e4"} Dec 06 01:17:01 crc kubenswrapper[4991]: I1206 01:17:01.789760 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" event={"ID":"c4cc76d8-965d-4228-8e93-4d9e736a9f95","Type":"ContainerStarted","Data":"18f954f7c8ba03f7bbbfdedd070c6bdefb187affaf7ab4cdc37dc85bca5ffda4"} Dec 06 01:17:02 crc kubenswrapper[4991]: I1206 01:17:02.807672 4991 generic.go:334] "Generic (PLEG): container finished" podID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerID="5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb" exitCode=0 Dec 06 01:17:02 crc kubenswrapper[4991]: I1206 01:17:02.807774 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerDied","Data":"5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb"} Dec 06 01:17:03 crc kubenswrapper[4991]: I1206 01:17:03.821400 4991 generic.go:334] "Generic (PLEG): container finished" podID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerID="d97f48a41065807d4fe0a37afad64634a3e6a1d0e2e9fce528184ca15e6882cf" exitCode=0 Dec 06 01:17:03 crc kubenswrapper[4991]: I1206 01:17:03.821510 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" 
event={"ID":"c4cc76d8-965d-4228-8e93-4d9e736a9f95","Type":"ContainerDied","Data":"d97f48a41065807d4fe0a37afad64634a3e6a1d0e2e9fce528184ca15e6882cf"} Dec 06 01:17:03 crc kubenswrapper[4991]: I1206 01:17:03.826860 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerStarted","Data":"97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37"} Dec 06 01:17:04 crc kubenswrapper[4991]: I1206 01:17:04.840497 4991 generic.go:334] "Generic (PLEG): container finished" podID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerID="48d9c7655e53e26ef62d667e9964e2848552ca67d524e934f68cbfed7b4e114d" exitCode=0 Dec 06 01:17:04 crc kubenswrapper[4991]: I1206 01:17:04.840579 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" event={"ID":"c4cc76d8-965d-4228-8e93-4d9e736a9f95","Type":"ContainerDied","Data":"48d9c7655e53e26ef62d667e9964e2848552ca67d524e934f68cbfed7b4e114d"} Dec 06 01:17:04 crc kubenswrapper[4991]: I1206 01:17:04.870226 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8nnxx" podStartSLOduration=3.378126666 podStartE2EDuration="5.870200037s" podCreationTimestamp="2025-12-06 01:16:59 +0000 UTC" firstStartedPulling="2025-12-06 01:17:00.776528312 +0000 UTC m=+894.126818830" lastFinishedPulling="2025-12-06 01:17:03.268601693 +0000 UTC m=+896.618892201" observedRunningTime="2025-12-06 01:17:03.868175525 +0000 UTC m=+897.218466003" watchObservedRunningTime="2025-12-06 01:17:04.870200037 +0000 UTC m=+898.220490535" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.218602 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.320215 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-bundle\") pod \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.320378 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsjvw\" (UniqueName: \"kubernetes.io/projected/c4cc76d8-965d-4228-8e93-4d9e736a9f95-kube-api-access-dsjvw\") pod \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.320454 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-util\") pod \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\" (UID: \"c4cc76d8-965d-4228-8e93-4d9e736a9f95\") " Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.321373 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-bundle" (OuterVolumeSpecName: "bundle") pod "c4cc76d8-965d-4228-8e93-4d9e736a9f95" (UID: "c4cc76d8-965d-4228-8e93-4d9e736a9f95"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.330315 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4cc76d8-965d-4228-8e93-4d9e736a9f95-kube-api-access-dsjvw" (OuterVolumeSpecName: "kube-api-access-dsjvw") pod "c4cc76d8-965d-4228-8e93-4d9e736a9f95" (UID: "c4cc76d8-965d-4228-8e93-4d9e736a9f95"). InnerVolumeSpecName "kube-api-access-dsjvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.333890 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-util" (OuterVolumeSpecName: "util") pod "c4cc76d8-965d-4228-8e93-4d9e736a9f95" (UID: "c4cc76d8-965d-4228-8e93-4d9e736a9f95"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.421775 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsjvw\" (UniqueName: \"kubernetes.io/projected/c4cc76d8-965d-4228-8e93-4d9e736a9f95-kube-api-access-dsjvw\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.421828 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-util\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.421846 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4cc76d8-965d-4228-8e93-4d9e736a9f95-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.861435 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" event={"ID":"c4cc76d8-965d-4228-8e93-4d9e736a9f95","Type":"ContainerDied","Data":"18f954f7c8ba03f7bbbfdedd070c6bdefb187affaf7ab4cdc37dc85bca5ffda4"} Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.861512 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18f954f7c8ba03f7bbbfdedd070c6bdefb187affaf7ab4cdc37dc85bca5ffda4" Dec 06 01:17:06 crc kubenswrapper[4991]: I1206 01:17:06.861522 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.086454 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7sshq"] Dec 06 01:17:09 crc kubenswrapper[4991]: E1206 01:17:09.087446 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="util" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.087475 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="util" Dec 06 01:17:09 crc kubenswrapper[4991]: E1206 01:17:09.087524 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="extract" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.087541 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="extract" Dec 06 01:17:09 crc kubenswrapper[4991]: E1206 01:17:09.087570 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="pull" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.087585 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="pull" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.087812 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cc76d8-965d-4228-8e93-4d9e736a9f95" containerName="extract" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.089578 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.104861 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7sshq"] Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.163075 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpjqd\" (UniqueName: \"kubernetes.io/projected/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-kube-api-access-lpjqd\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.163470 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-utilities\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.163712 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-catalog-content\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.265345 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-catalog-content\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.265612 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lpjqd\" (UniqueName: \"kubernetes.io/projected/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-kube-api-access-lpjqd\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.265683 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-utilities\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.266495 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-utilities\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.266495 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-catalog-content\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.291645 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpjqd\" (UniqueName: \"kubernetes.io/projected/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-kube-api-access-lpjqd\") pod \"certified-operators-7sshq\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.430089 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.430153 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.432435 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.505012 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.751499 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7sshq"] Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.883474 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerStarted","Data":"df43a7bd66ac64bff55b116af128b32dd58e0c60f696782bb63c452190ca4cf0"} Dec 06 01:17:09 crc kubenswrapper[4991]: I1206 01:17:09.928651 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:17:10 crc kubenswrapper[4991]: I1206 01:17:10.889417 4991 generic.go:334] "Generic (PLEG): container finished" podID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerID="c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5" exitCode=0 Dec 06 01:17:10 crc kubenswrapper[4991]: I1206 01:17:10.890307 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerDied","Data":"c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5"} Dec 06 01:17:11 crc kubenswrapper[4991]: I1206 01:17:11.270206 4991 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-8nnxx"] Dec 06 01:17:11 crc kubenswrapper[4991]: I1206 01:17:11.897584 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerStarted","Data":"0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed"} Dec 06 01:17:11 crc kubenswrapper[4991]: I1206 01:17:11.897718 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8nnxx" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="registry-server" containerID="cri-o://97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37" gracePeriod=2 Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.334561 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.406304 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-catalog-content\") pod \"b475cb0a-14ff-4a19-983a-c4a352134ffb\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.406775 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr7k9\" (UniqueName: \"kubernetes.io/projected/b475cb0a-14ff-4a19-983a-c4a352134ffb-kube-api-access-zr7k9\") pod \"b475cb0a-14ff-4a19-983a-c4a352134ffb\" (UID: \"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.406951 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-utilities\") pod \"b475cb0a-14ff-4a19-983a-c4a352134ffb\" (UID: 
\"b475cb0a-14ff-4a19-983a-c4a352134ffb\") " Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.407890 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-utilities" (OuterVolumeSpecName: "utilities") pod "b475cb0a-14ff-4a19-983a-c4a352134ffb" (UID: "b475cb0a-14ff-4a19-983a-c4a352134ffb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.421326 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b475cb0a-14ff-4a19-983a-c4a352134ffb-kube-api-access-zr7k9" (OuterVolumeSpecName: "kube-api-access-zr7k9") pod "b475cb0a-14ff-4a19-983a-c4a352134ffb" (UID: "b475cb0a-14ff-4a19-983a-c4a352134ffb"). InnerVolumeSpecName "kube-api-access-zr7k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.457518 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b475cb0a-14ff-4a19-983a-c4a352134ffb" (UID: "b475cb0a-14ff-4a19-983a-c4a352134ffb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.508044 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.508089 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b475cb0a-14ff-4a19-983a-c4a352134ffb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.508107 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr7k9\" (UniqueName: \"kubernetes.io/projected/b475cb0a-14ff-4a19-983a-c4a352134ffb-kube-api-access-zr7k9\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.906366 4991 generic.go:334] "Generic (PLEG): container finished" podID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerID="97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37" exitCode=0 Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.906425 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerDied","Data":"97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37"} Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.906473 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nnxx" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.906490 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nnxx" event={"ID":"b475cb0a-14ff-4a19-983a-c4a352134ffb","Type":"ContainerDied","Data":"a00e465744ac583405d4ddb759ac0a1207641d5c04af3a06c0898d7333aab6ed"} Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.906518 4991 scope.go:117] "RemoveContainer" containerID="97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.909453 4991 generic.go:334] "Generic (PLEG): container finished" podID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerID="0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed" exitCode=0 Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.909489 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerDied","Data":"0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed"} Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.927876 4991 scope.go:117] "RemoveContainer" containerID="5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.953893 4991 scope.go:117] "RemoveContainer" containerID="560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.956100 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nnxx"] Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.959064 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8nnxx"] Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.981506 4991 scope.go:117] "RemoveContainer" 
containerID="97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37" Dec 06 01:17:12 crc kubenswrapper[4991]: E1206 01:17:12.984085 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37\": container with ID starting with 97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37 not found: ID does not exist" containerID="97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.984150 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37"} err="failed to get container status \"97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37\": rpc error: code = NotFound desc = could not find container \"97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37\": container with ID starting with 97f0513fcbae27416f15b43ba95729e9a78b225efdfcd53014f47aaa051bbf37 not found: ID does not exist" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.984191 4991 scope.go:117] "RemoveContainer" containerID="5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb" Dec 06 01:17:12 crc kubenswrapper[4991]: E1206 01:17:12.984672 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb\": container with ID starting with 5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb not found: ID does not exist" containerID="5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.984710 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb"} err="failed to get container status \"5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb\": rpc error: code = NotFound desc = could not find container \"5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb\": container with ID starting with 5c84056931d08bf87db028eee8a7160bcfca6c79f3f54021656c028d4cb588cb not found: ID does not exist" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.984739 4991 scope.go:117] "RemoveContainer" containerID="560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550" Dec 06 01:17:12 crc kubenswrapper[4991]: E1206 01:17:12.985175 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550\": container with ID starting with 560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550 not found: ID does not exist" containerID="560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550" Dec 06 01:17:12 crc kubenswrapper[4991]: I1206 01:17:12.985204 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550"} err="failed to get container status \"560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550\": rpc error: code = NotFound desc = could not find container \"560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550\": container with ID starting with 560d45c5be403370396f9655e62afa151ff719944334bf0261cc9618ba41d550 not found: ID does not exist" Dec 06 01:17:13 crc kubenswrapper[4991]: I1206 01:17:13.044343 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" path="/var/lib/kubelet/pods/b475cb0a-14ff-4a19-983a-c4a352134ffb/volumes" Dec 06 01:17:13 crc kubenswrapper[4991]: I1206 
01:17:13.918635 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerStarted","Data":"aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5"} Dec 06 01:17:13 crc kubenswrapper[4991]: I1206 01:17:13.940955 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7sshq" podStartSLOduration=2.242775682 podStartE2EDuration="4.940935372s" podCreationTimestamp="2025-12-06 01:17:09 +0000 UTC" firstStartedPulling="2025-12-06 01:17:10.890814089 +0000 UTC m=+904.241104557" lastFinishedPulling="2025-12-06 01:17:13.588973779 +0000 UTC m=+906.939264247" observedRunningTime="2025-12-06 01:17:13.936333469 +0000 UTC m=+907.286623937" watchObservedRunningTime="2025-12-06 01:17:13.940935372 +0000 UTC m=+907.291225850" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.546536 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd"] Dec 06 01:17:16 crc kubenswrapper[4991]: E1206 01:17:16.547212 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="extract-utilities" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.547229 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="extract-utilities" Dec 06 01:17:16 crc kubenswrapper[4991]: E1206 01:17:16.547238 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="extract-content" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.547246 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="extract-content" Dec 06 01:17:16 crc kubenswrapper[4991]: E1206 01:17:16.547260 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="registry-server" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.547267 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="registry-server" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.547384 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b475cb0a-14ff-4a19-983a-c4a352134ffb" containerName="registry-server" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.547869 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.550489 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.550595 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7hjph" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.551890 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.552000 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.555652 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.574111 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd"] Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.676734 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fbrb6\" (UniqueName: \"kubernetes.io/projected/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-kube-api-access-fbrb6\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.676808 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-webhook-cert\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.676846 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-apiservice-cert\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.778474 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-apiservice-cert\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.778602 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbrb6\" (UniqueName: \"kubernetes.io/projected/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-kube-api-access-fbrb6\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" 
(UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.779160 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-webhook-cert\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.788667 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-webhook-cert\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.799283 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-apiservice-cert\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.812679 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbrb6\" (UniqueName: \"kubernetes.io/projected/4fd49c2f-3ac6-4444-ab0a-4d9a47617377-kube-api-access-fbrb6\") pod \"metallb-operator-controller-manager-545bb986f8-rh6kd\" (UID: \"4fd49c2f-3ac6-4444-ab0a-4d9a47617377\") " pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.863077 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.873037 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-8495449b99-p7f45"] Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.874114 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.879226 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.880185 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-vgvzx" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.880345 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.903577 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8495449b99-p7f45"] Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.981739 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eba891b-78cc-4a1c-8f65-a09829f349e8-webhook-cert\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.981837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchc7\" (UniqueName: \"kubernetes.io/projected/2eba891b-78cc-4a1c-8f65-a09829f349e8-kube-api-access-zchc7\") pod 
\"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:16 crc kubenswrapper[4991]: I1206 01:17:16.981870 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eba891b-78cc-4a1c-8f65-a09829f349e8-apiservice-cert\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.083331 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchc7\" (UniqueName: \"kubernetes.io/projected/2eba891b-78cc-4a1c-8f65-a09829f349e8-kube-api-access-zchc7\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.083386 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eba891b-78cc-4a1c-8f65-a09829f349e8-apiservice-cert\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.083456 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eba891b-78cc-4a1c-8f65-a09829f349e8-webhook-cert\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.089773 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eba891b-78cc-4a1c-8f65-a09829f349e8-apiservice-cert\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.092287 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eba891b-78cc-4a1c-8f65-a09829f349e8-webhook-cert\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.101120 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchc7\" (UniqueName: \"kubernetes.io/projected/2eba891b-78cc-4a1c-8f65-a09829f349e8-kube-api-access-zchc7\") pod \"metallb-operator-webhook-server-8495449b99-p7f45\" (UID: \"2eba891b-78cc-4a1c-8f65-a09829f349e8\") " pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.227620 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.380522 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd"] Dec 06 01:17:17 crc kubenswrapper[4991]: W1206 01:17:17.396140 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd49c2f_3ac6_4444_ab0a_4d9a47617377.slice/crio-03b994f15b9be21f9c87643a6fe64c597e68a0ac00056c1c4d662fb18b07b779 WatchSource:0}: Error finding container 03b994f15b9be21f9c87643a6fe64c597e68a0ac00056c1c4d662fb18b07b779: Status 404 returned error can't find the container with id 03b994f15b9be21f9c87643a6fe64c597e68a0ac00056c1c4d662fb18b07b779 Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.452537 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8495449b99-p7f45"] Dec 06 01:17:17 crc kubenswrapper[4991]: W1206 01:17:17.459247 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eba891b_78cc_4a1c_8f65_a09829f349e8.slice/crio-10bc02f3b33647e3fa048d080380c2adc3476922126926be1118d3aa1a16607a WatchSource:0}: Error finding container 10bc02f3b33647e3fa048d080380c2adc3476922126926be1118d3aa1a16607a: Status 404 returned error can't find the container with id 10bc02f3b33647e3fa048d080380c2adc3476922126926be1118d3aa1a16607a Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.947704 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" event={"ID":"4fd49c2f-3ac6-4444-ab0a-4d9a47617377","Type":"ContainerStarted","Data":"03b994f15b9be21f9c87643a6fe64c597e68a0ac00056c1c4d662fb18b07b779"} Dec 06 01:17:17 crc kubenswrapper[4991]: I1206 01:17:17.949464 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" event={"ID":"2eba891b-78cc-4a1c-8f65-a09829f349e8","Type":"ContainerStarted","Data":"10bc02f3b33647e3fa048d080380c2adc3476922126926be1118d3aa1a16607a"} Dec 06 01:17:19 crc kubenswrapper[4991]: I1206 01:17:19.433171 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:19 crc kubenswrapper[4991]: I1206 01:17:19.433613 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:19 crc kubenswrapper[4991]: I1206 01:17:19.476577 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:20 crc kubenswrapper[4991]: I1206 01:17:20.047823 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:21 crc kubenswrapper[4991]: I1206 01:17:21.262383 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7sshq"] Dec 06 01:17:21 crc kubenswrapper[4991]: I1206 01:17:21.994629 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7sshq" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="registry-server" containerID="cri-o://aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5" gracePeriod=2 Dec 06 01:17:22 crc kubenswrapper[4991]: I1206 01:17:22.843901 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:22 crc kubenswrapper[4991]: I1206 01:17:22.968388 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-catalog-content\") pod \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " Dec 06 01:17:22 crc kubenswrapper[4991]: I1206 01:17:22.968497 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpjqd\" (UniqueName: \"kubernetes.io/projected/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-kube-api-access-lpjqd\") pod \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " Dec 06 01:17:22 crc kubenswrapper[4991]: I1206 01:17:22.968563 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-utilities\") pod \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\" (UID: \"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b\") " Dec 06 01:17:22 crc kubenswrapper[4991]: I1206 01:17:22.969565 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-utilities" (OuterVolumeSpecName: "utilities") pod "82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" (UID: "82476ed7-68cd-4fdf-a1ae-1158e1a1be5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:17:22 crc kubenswrapper[4991]: I1206 01:17:22.975589 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-kube-api-access-lpjqd" (OuterVolumeSpecName: "kube-api-access-lpjqd") pod "82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" (UID: "82476ed7-68cd-4fdf-a1ae-1158e1a1be5b"). InnerVolumeSpecName "kube-api-access-lpjqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.001298 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" event={"ID":"4fd49c2f-3ac6-4444-ab0a-4d9a47617377","Type":"ContainerStarted","Data":"cbdc30e42890e3def7431ec9f20de1052e0a0b587118fc802f04692cbc70d0e8"} Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.002300 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.005269 4991 generic.go:334] "Generic (PLEG): container finished" podID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerID="aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5" exitCode=0 Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.005301 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerDied","Data":"aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5"} Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.005335 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sshq" event={"ID":"82476ed7-68cd-4fdf-a1ae-1158e1a1be5b","Type":"ContainerDied","Data":"df43a7bd66ac64bff55b116af128b32dd58e0c60f696782bb63c452190ca4cf0"} Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.005362 4991 scope.go:117] "RemoveContainer" containerID="aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.005370 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7sshq" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.007949 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" event={"ID":"2eba891b-78cc-4a1c-8f65-a09829f349e8","Type":"ContainerStarted","Data":"32d302d296649ac1f3900a336894b0ddfd0b76d5baa454ba59b7a9f4ec9e9227"} Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.008328 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.010291 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" (UID: "82476ed7-68cd-4fdf-a1ae-1158e1a1be5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.026747 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" podStartSLOduration=1.8227507429999998 podStartE2EDuration="7.026730099s" podCreationTimestamp="2025-12-06 01:17:16 +0000 UTC" firstStartedPulling="2025-12-06 01:17:17.403584586 +0000 UTC m=+910.753875064" lastFinishedPulling="2025-12-06 01:17:22.607563952 +0000 UTC m=+915.957854420" observedRunningTime="2025-12-06 01:17:23.020928484 +0000 UTC m=+916.371218952" watchObservedRunningTime="2025-12-06 01:17:23.026730099 +0000 UTC m=+916.377020567" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.035266 4991 scope.go:117] "RemoveContainer" containerID="0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.040775 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" podStartSLOduration=1.876437696 podStartE2EDuration="7.040758374s" podCreationTimestamp="2025-12-06 01:17:16 +0000 UTC" firstStartedPulling="2025-12-06 01:17:17.460921536 +0000 UTC m=+910.811212014" lastFinishedPulling="2025-12-06 01:17:22.625242224 +0000 UTC m=+915.975532692" observedRunningTime="2025-12-06 01:17:23.037287991 +0000 UTC m=+916.387578459" watchObservedRunningTime="2025-12-06 01:17:23.040758374 +0000 UTC m=+916.391048842" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.062619 4991 scope.go:117] "RemoveContainer" containerID="c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.070759 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 
01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.070792 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpjqd\" (UniqueName: \"kubernetes.io/projected/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-kube-api-access-lpjqd\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.070806 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.099198 4991 scope.go:117] "RemoveContainer" containerID="aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5" Dec 06 01:17:23 crc kubenswrapper[4991]: E1206 01:17:23.099691 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5\": container with ID starting with aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5 not found: ID does not exist" containerID="aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.099733 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5"} err="failed to get container status \"aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5\": rpc error: code = NotFound desc = could not find container \"aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5\": container with ID starting with aed55b9c494ea9702fb50a96bfa90deeaac3a8fc801d7e52041c85f2f924cca5 not found: ID does not exist" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.099762 4991 scope.go:117] "RemoveContainer" containerID="0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed" Dec 06 01:17:23 crc kubenswrapper[4991]: E1206 
01:17:23.100053 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed\": container with ID starting with 0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed not found: ID does not exist" containerID="0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.100085 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed"} err="failed to get container status \"0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed\": rpc error: code = NotFound desc = could not find container \"0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed\": container with ID starting with 0668e1fa7444c4c4143901f980d9ed03b33e3c1169a2f9a633600eed7a4a40ed not found: ID does not exist" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.100149 4991 scope.go:117] "RemoveContainer" containerID="c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5" Dec 06 01:17:23 crc kubenswrapper[4991]: E1206 01:17:23.100474 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5\": container with ID starting with c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5 not found: ID does not exist" containerID="c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.100539 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5"} err="failed to get container status \"c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5\": rpc 
error: code = NotFound desc = could not find container \"c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5\": container with ID starting with c9bc72ba7072e8bbe3e2b2ddd82b624fef8281c218b336421713bedbb1161be5 not found: ID does not exist" Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.327531 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7sshq"] Dec 06 01:17:23 crc kubenswrapper[4991]: I1206 01:17:23.333765 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7sshq"] Dec 06 01:17:25 crc kubenswrapper[4991]: I1206 01:17:25.051677 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" path="/var/lib/kubelet/pods/82476ed7-68cd-4fdf-a1ae-1158e1a1be5b/volumes" Dec 06 01:17:37 crc kubenswrapper[4991]: I1206 01:17:37.232224 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8495449b99-p7f45" Dec 06 01:17:56 crc kubenswrapper[4991]: I1206 01:17:56.868034 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-545bb986f8-rh6kd" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.615040 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fkgqs"] Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.615306 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="extract-content" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.615326 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="extract-content" Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.615336 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" 
containerName="extract-utilities" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.615347 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="extract-utilities" Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.615359 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="registry-server" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.615366 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="registry-server" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.615505 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="82476ed7-68cd-4fdf-a1ae-1158e1a1be5b" containerName="registry-server" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.630397 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.632458 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vxsmc" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.632606 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb"] Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.633024 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.633217 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.636483 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.648633 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.663202 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb"] Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.761121 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4hfhs"] Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.762034 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.763970 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.764109 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.767557 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.769294 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mldhc" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.774644 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-qtv46"] Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.776145 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.778297 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784669 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrcx\" (UniqueName: \"kubernetes.io/projected/61ac5c00-6180-4a7b-a4c0-fd98da3bb456-kube-api-access-pzrcx\") pod \"frr-k8s-webhook-server-7fcb986d4-t5xxb\" (UID: \"61ac5c00-6180-4a7b-a4c0-fd98da3bb456\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784709 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-startup\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784739 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics-certs\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784761 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-conf\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784805 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" 
(UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784818 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-sockets\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784833 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-reloader\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784849 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61ac5c00-6180-4a7b-a4c0-fd98da3bb456-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t5xxb\" (UID: \"61ac5c00-6180-4a7b-a4c0-fd98da3bb456\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.784869 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw257\" (UniqueName: \"kubernetes.io/projected/a82fdef0-9a99-4150-b86b-c8839caa41a1-kube-api-access-lw257\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.794449 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-qtv46"] Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887142 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrcx\" (UniqueName: \"kubernetes.io/projected/61ac5c00-6180-4a7b-a4c0-fd98da3bb456-kube-api-access-pzrcx\") pod \"frr-k8s-webhook-server-7fcb986d4-t5xxb\" (UID: \"61ac5c00-6180-4a7b-a4c0-fd98da3bb456\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887222 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-startup\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887300 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87hhc\" (UniqueName: \"kubernetes.io/projected/79797e2b-b768-414a-8f3f-0a68ae045269-kube-api-access-87hhc\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887331 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-metrics-certs\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887380 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics-certs\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887408 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/79797e2b-b768-414a-8f3f-0a68ae045269-metallb-excludel2\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887449 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-conf\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887532 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887561 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-sockets\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887593 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-reloader\") pod \"frr-k8s-fkgqs\" (UID: 
\"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887616 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61ac5c00-6180-4a7b-a4c0-fd98da3bb456-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t5xxb\" (UID: \"61ac5c00-6180-4a7b-a4c0-fd98da3bb456\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887648 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-metrics-certs\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887684 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw257\" (UniqueName: \"kubernetes.io/projected/a82fdef0-9a99-4150-b86b-c8839caa41a1-kube-api-access-lw257\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887717 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-cert\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.887745 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nhk\" (UniqueName: \"kubernetes.io/projected/8110ff83-33b6-4f27-946e-383e8bff44c7-kube-api-access-76nhk\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " 
pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.888225 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-sockets\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.888345 4991 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.888391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-startup\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.888416 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics-certs podName:a82fdef0-9a99-4150-b86b-c8839caa41a1 nodeName:}" failed. No retries permitted until 2025-12-06 01:17:58.388398639 +0000 UTC m=+951.738689107 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics-certs") pod "frr-k8s-fkgqs" (UID: "a82fdef0-9a99-4150-b86b-c8839caa41a1") : secret "frr-k8s-certs-secret" not found Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.888461 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-reloader\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.888608 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-frr-conf\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.888838 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.908084 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61ac5c00-6180-4a7b-a4c0-fd98da3bb456-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-t5xxb\" (UID: \"61ac5c00-6180-4a7b-a4c0-fd98da3bb456\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.910563 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw257\" (UniqueName: \"kubernetes.io/projected/a82fdef0-9a99-4150-b86b-c8839caa41a1-kube-api-access-lw257\") pod \"frr-k8s-fkgqs\" (UID: 
\"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.916140 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrcx\" (UniqueName: \"kubernetes.io/projected/61ac5c00-6180-4a7b-a4c0-fd98da3bb456-kube-api-access-pzrcx\") pod \"frr-k8s-webhook-server-7fcb986d4-t5xxb\" (UID: \"61ac5c00-6180-4a7b-a4c0-fd98da3bb456\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.960816 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990093 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-metrics-certs\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990203 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-cert\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990228 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76nhk\" (UniqueName: \"kubernetes.io/projected/8110ff83-33b6-4f27-946e-383e8bff44c7-kube-api-access-76nhk\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990261 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990285 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87hhc\" (UniqueName: \"kubernetes.io/projected/79797e2b-b768-414a-8f3f-0a68ae045269-kube-api-access-87hhc\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990302 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-metrics-certs\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.990336 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/79797e2b-b768-414a-8f3f-0a68ae045269-metallb-excludel2\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.991198 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/79797e2b-b768-414a-8f3f-0a68ae045269-metallb-excludel2\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.992535 4991 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.992635 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist podName:79797e2b-b768-414a-8f3f-0a68ae045269 nodeName:}" failed. No retries permitted until 2025-12-06 01:17:58.492611629 +0000 UTC m=+951.842902297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist") pod "speaker-4hfhs" (UID: "79797e2b-b768-414a-8f3f-0a68ae045269") : secret "metallb-memberlist" not found Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.993283 4991 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 06 01:17:57 crc kubenswrapper[4991]: E1206 01:17:57.993319 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-metrics-certs podName:8110ff83-33b6-4f27-946e-383e8bff44c7 nodeName:}" failed. No retries permitted until 2025-12-06 01:17:58.493308348 +0000 UTC m=+951.843599026 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-metrics-certs") pod "controller-f8648f98b-qtv46" (UID: "8110ff83-33b6-4f27-946e-383e8bff44c7") : secret "controller-certs-secret" not found Dec 06 01:17:57 crc kubenswrapper[4991]: I1206 01:17:57.993504 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-metrics-certs\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.003718 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-cert\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.009878 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nhk\" (UniqueName: \"kubernetes.io/projected/8110ff83-33b6-4f27-946e-383e8bff44c7-kube-api-access-76nhk\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.020718 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87hhc\" (UniqueName: \"kubernetes.io/projected/79797e2b-b768-414a-8f3f-0a68ae045269-kube-api-access-87hhc\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.408700 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics-certs\") 
pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.413440 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a82fdef0-9a99-4150-b86b-c8839caa41a1-metrics-certs\") pod \"frr-k8s-fkgqs\" (UID: \"a82fdef0-9a99-4150-b86b-c8839caa41a1\") " pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.433493 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb"] Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.510894 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.511016 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-metrics-certs\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:58 crc kubenswrapper[4991]: E1206 01:17:58.511900 4991 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 06 01:17:58 crc kubenswrapper[4991]: E1206 01:17:58.512055 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist podName:79797e2b-b768-414a-8f3f-0a68ae045269 nodeName:}" failed. No retries permitted until 2025-12-06 01:17:59.512027582 +0000 UTC m=+952.862318090 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist") pod "speaker-4hfhs" (UID: "79797e2b-b768-414a-8f3f-0a68ae045269") : secret "metallb-memberlist" not found Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.516836 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8110ff83-33b6-4f27-946e-383e8bff44c7-metrics-certs\") pod \"controller-f8648f98b-qtv46\" (UID: \"8110ff83-33b6-4f27-946e-383e8bff44c7\") " pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.552138 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.688750 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:17:58 crc kubenswrapper[4991]: I1206 01:17:58.965049 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-qtv46"] Dec 06 01:17:58 crc kubenswrapper[4991]: W1206 01:17:58.973857 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8110ff83_33b6_4f27_946e_383e8bff44c7.slice/crio-7ed3868a9d089751ff1c55a7f49404da9c3f565d3f4a76aa9fb5d10287472017 WatchSource:0}: Error finding container 7ed3868a9d089751ff1c55a7f49404da9c3f565d3f4a76aa9fb5d10287472017: Status 404 returned error can't find the container with id 7ed3868a9d089751ff1c55a7f49404da9c3f565d3f4a76aa9fb5d10287472017 Dec 06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.259650 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"52617a34db4fe7212a1588547635b70965939c1301dab25f5f48f4246df02e6d"} Dec 
06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.261130 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qtv46" event={"ID":"8110ff83-33b6-4f27-946e-383e8bff44c7","Type":"ContainerStarted","Data":"edaa46e3b0786b92549577058ea8dabd94ef712ec10025bc73782efb37cae87b"} Dec 06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.261180 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qtv46" event={"ID":"8110ff83-33b6-4f27-946e-383e8bff44c7","Type":"ContainerStarted","Data":"7ed3868a9d089751ff1c55a7f49404da9c3f565d3f4a76aa9fb5d10287472017"} Dec 06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.261912 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" event={"ID":"61ac5c00-6180-4a7b-a4c0-fd98da3bb456","Type":"ContainerStarted","Data":"81fd873050b1ee3cb6b6d337874a87224f85e868a305432e2877d1ae6bcfebe8"} Dec 06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.526243 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.533477 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/79797e2b-b768-414a-8f3f-0a68ae045269-memberlist\") pod \"speaker-4hfhs\" (UID: \"79797e2b-b768-414a-8f3f-0a68ae045269\") " pod="metallb-system/speaker-4hfhs" Dec 06 01:17:59 crc kubenswrapper[4991]: I1206 01:17:59.576521 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-4hfhs" Dec 06 01:17:59 crc kubenswrapper[4991]: W1206 01:17:59.598902 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79797e2b_b768_414a_8f3f_0a68ae045269.slice/crio-4eb14baeb1ad959399a6c2bd489d93585e8fe2d95941d602b2ead5bf4d049c3f WatchSource:0}: Error finding container 4eb14baeb1ad959399a6c2bd489d93585e8fe2d95941d602b2ead5bf4d049c3f: Status 404 returned error can't find the container with id 4eb14baeb1ad959399a6c2bd489d93585e8fe2d95941d602b2ead5bf4d049c3f Dec 06 01:18:00 crc kubenswrapper[4991]: I1206 01:18:00.282639 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qtv46" event={"ID":"8110ff83-33b6-4f27-946e-383e8bff44c7","Type":"ContainerStarted","Data":"499c8697aa79fcec79fd73fad8de01e20ec8452a02eb4ceb5f3d2e6cc9403fa1"} Dec 06 01:18:00 crc kubenswrapper[4991]: I1206 01:18:00.283296 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:18:00 crc kubenswrapper[4991]: I1206 01:18:00.287451 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4hfhs" event={"ID":"79797e2b-b768-414a-8f3f-0a68ae045269","Type":"ContainerStarted","Data":"1a5c720c7cd75ff1fea4e3e8dcd04947de677bd6258276feb01a1e4529fd85ab"} Dec 06 01:18:00 crc kubenswrapper[4991]: I1206 01:18:00.287525 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4hfhs" event={"ID":"79797e2b-b768-414a-8f3f-0a68ae045269","Type":"ContainerStarted","Data":"4eb14baeb1ad959399a6c2bd489d93585e8fe2d95941d602b2ead5bf4d049c3f"} Dec 06 01:18:00 crc kubenswrapper[4991]: I1206 01:18:00.305150 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-qtv46" podStartSLOduration=3.305129628 podStartE2EDuration="3.305129628s" podCreationTimestamp="2025-12-06 01:17:57 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:18:00.303974167 +0000 UTC m=+953.654264635" watchObservedRunningTime="2025-12-06 01:18:00.305129628 +0000 UTC m=+953.655420096" Dec 06 01:18:01 crc kubenswrapper[4991]: I1206 01:18:01.307537 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4hfhs" event={"ID":"79797e2b-b768-414a-8f3f-0a68ae045269","Type":"ContainerStarted","Data":"62492152a9d4c7d2b1ce18369ae6351894be273a55a9f8dc6d3a58dbf96244e4"} Dec 06 01:18:01 crc kubenswrapper[4991]: I1206 01:18:01.339788 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4hfhs" podStartSLOduration=4.339763391 podStartE2EDuration="4.339763391s" podCreationTimestamp="2025-12-06 01:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:18:01.339097913 +0000 UTC m=+954.689388381" watchObservedRunningTime="2025-12-06 01:18:01.339763391 +0000 UTC m=+954.690053859" Dec 06 01:18:02 crc kubenswrapper[4991]: I1206 01:18:02.317160 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4hfhs" Dec 06 01:18:06 crc kubenswrapper[4991]: I1206 01:18:06.346162 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" event={"ID":"61ac5c00-6180-4a7b-a4c0-fd98da3bb456","Type":"ContainerStarted","Data":"2b9fc69e0dff2b07919f39c5886c961725454d861b92f88c037a1a6c02d34af2"} Dec 06 01:18:06 crc kubenswrapper[4991]: I1206 01:18:06.346874 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:18:06 crc kubenswrapper[4991]: I1206 01:18:06.347962 4991 generic.go:334] "Generic (PLEG): container finished" podID="a82fdef0-9a99-4150-b86b-c8839caa41a1" 
containerID="7bbb98a9f6c9a9381ff1cf377afcee3d3172cbd28b138824a6f4cb119281f6e5" exitCode=0 Dec 06 01:18:06 crc kubenswrapper[4991]: I1206 01:18:06.348042 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerDied","Data":"7bbb98a9f6c9a9381ff1cf377afcee3d3172cbd28b138824a6f4cb119281f6e5"} Dec 06 01:18:06 crc kubenswrapper[4991]: I1206 01:18:06.379607 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" podStartSLOduration=2.016152487 podStartE2EDuration="9.379585506s" podCreationTimestamp="2025-12-06 01:17:57 +0000 UTC" firstStartedPulling="2025-12-06 01:17:58.437830372 +0000 UTC m=+951.788120840" lastFinishedPulling="2025-12-06 01:18:05.801263381 +0000 UTC m=+959.151553859" observedRunningTime="2025-12-06 01:18:06.378062695 +0000 UTC m=+959.728353173" watchObservedRunningTime="2025-12-06 01:18:06.379585506 +0000 UTC m=+959.729875984" Dec 06 01:18:07 crc kubenswrapper[4991]: I1206 01:18:07.358255 4991 generic.go:334] "Generic (PLEG): container finished" podID="a82fdef0-9a99-4150-b86b-c8839caa41a1" containerID="8b1b8db7651d44af06ad345571106862d9e8e885cb8207d2f0afe79dda14e16c" exitCode=0 Dec 06 01:18:07 crc kubenswrapper[4991]: I1206 01:18:07.358403 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerDied","Data":"8b1b8db7651d44af06ad345571106862d9e8e885cb8207d2f0afe79dda14e16c"} Dec 06 01:18:08 crc kubenswrapper[4991]: I1206 01:18:08.370327 4991 generic.go:334] "Generic (PLEG): container finished" podID="a82fdef0-9a99-4150-b86b-c8839caa41a1" containerID="e1ff1485f5d51e076427720f4790495242f057c3d24c7e7d0bfc698fd9f228a4" exitCode=0 Dec 06 01:18:08 crc kubenswrapper[4991]: I1206 01:18:08.370410 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" 
event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerDied","Data":"e1ff1485f5d51e076427720f4790495242f057c3d24c7e7d0bfc698fd9f228a4"} Dec 06 01:18:09 crc kubenswrapper[4991]: I1206 01:18:09.383780 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"64956f1c2eca55497fe3be79407506eee84d5cb97bd3b774e2920c676ff25f99"} Dec 06 01:18:09 crc kubenswrapper[4991]: I1206 01:18:09.384290 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"5e13b4dae151fcaf514d0c8e921f3240d698f0c482164bee2cee497870b5fe21"} Dec 06 01:18:09 crc kubenswrapper[4991]: I1206 01:18:09.384313 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"679e7bf4f0f3e1dd7721992a2c370b2e05ae0713db754de8b998b7bf3a5e4da7"} Dec 06 01:18:09 crc kubenswrapper[4991]: I1206 01:18:09.384329 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"af956c966fcb8c9f88c0c5ba6b641d579890a3d49a8bac3a32cde4b6cc49e4a7"} Dec 06 01:18:09 crc kubenswrapper[4991]: I1206 01:18:09.384344 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"ae8052e47a06dab0977580f5cbe0f36914a816b44bbadbb8bea88ae9280f6e73"} Dec 06 01:18:09 crc kubenswrapper[4991]: I1206 01:18:09.580618 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4hfhs" Dec 06 01:18:10 crc kubenswrapper[4991]: I1206 01:18:10.400063 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fkgqs" 
event={"ID":"a82fdef0-9a99-4150-b86b-c8839caa41a1","Type":"ContainerStarted","Data":"3534502d772048e0ace2cef38767d707e6e27b8fcb0e3cda9c8f3c895131a297"} Dec 06 01:18:10 crc kubenswrapper[4991]: I1206 01:18:10.400713 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:18:10 crc kubenswrapper[4991]: I1206 01:18:10.438109 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fkgqs" podStartSLOduration=6.362798222 podStartE2EDuration="13.438075331s" podCreationTimestamp="2025-12-06 01:17:57 +0000 UTC" firstStartedPulling="2025-12-06 01:17:58.73301774 +0000 UTC m=+952.083308248" lastFinishedPulling="2025-12-06 01:18:05.808294889 +0000 UTC m=+959.158585357" observedRunningTime="2025-12-06 01:18:10.430147589 +0000 UTC m=+963.780438117" watchObservedRunningTime="2025-12-06 01:18:10.438075331 +0000 UTC m=+963.788365859" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.427593 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rzpch"] Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.428604 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.430657 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j2kjr" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.430772 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.430915 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.444507 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rzpch"] Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.469732 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7zp\" (UniqueName: \"kubernetes.io/projected/8ad09275-756f-4d22-8744-f2f17045b88f-kube-api-access-vz7zp\") pod \"openstack-operator-index-rzpch\" (UID: \"8ad09275-756f-4d22-8744-f2f17045b88f\") " pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.571097 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7zp\" (UniqueName: \"kubernetes.io/projected/8ad09275-756f-4d22-8744-f2f17045b88f-kube-api-access-vz7zp\") pod \"openstack-operator-index-rzpch\" (UID: \"8ad09275-756f-4d22-8744-f2f17045b88f\") " pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.601785 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7zp\" (UniqueName: \"kubernetes.io/projected/8ad09275-756f-4d22-8744-f2f17045b88f-kube-api-access-vz7zp\") pod \"openstack-operator-index-rzpch\" (UID: 
\"8ad09275-756f-4d22-8744-f2f17045b88f\") " pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:12 crc kubenswrapper[4991]: I1206 01:18:12.762902 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:13 crc kubenswrapper[4991]: I1206 01:18:13.282652 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rzpch"] Dec 06 01:18:13 crc kubenswrapper[4991]: W1206 01:18:13.283723 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad09275_756f_4d22_8744_f2f17045b88f.slice/crio-7ea6a358af971de18f4467b3fb68d147c4b722456a9fb0d89071aa396ffd3eaa WatchSource:0}: Error finding container 7ea6a358af971de18f4467b3fb68d147c4b722456a9fb0d89071aa396ffd3eaa: Status 404 returned error can't find the container with id 7ea6a358af971de18f4467b3fb68d147c4b722456a9fb0d89071aa396ffd3eaa Dec 06 01:18:13 crc kubenswrapper[4991]: I1206 01:18:13.421326 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rzpch" event={"ID":"8ad09275-756f-4d22-8744-f2f17045b88f","Type":"ContainerStarted","Data":"7ea6a358af971de18f4467b3fb68d147c4b722456a9fb0d89071aa396ffd3eaa"} Dec 06 01:18:13 crc kubenswrapper[4991]: I1206 01:18:13.553589 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:18:13 crc kubenswrapper[4991]: I1206 01:18:13.603668 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:18:15 crc kubenswrapper[4991]: I1206 01:18:15.607002 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rzpch"] Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.221901 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-qv7hh"] Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.223478 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.240118 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qv7hh"] Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.324871 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbsk4\" (UniqueName: \"kubernetes.io/projected/8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75-kube-api-access-hbsk4\") pod \"openstack-operator-index-qv7hh\" (UID: \"8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75\") " pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.427444 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbsk4\" (UniqueName: \"kubernetes.io/projected/8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75-kube-api-access-hbsk4\") pod \"openstack-operator-index-qv7hh\" (UID: \"8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75\") " pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.445126 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rzpch" event={"ID":"8ad09275-756f-4d22-8744-f2f17045b88f","Type":"ContainerStarted","Data":"b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe"} Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.445420 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rzpch" podUID="8ad09275-756f-4d22-8744-f2f17045b88f" containerName="registry-server" containerID="cri-o://b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe" gracePeriod=2 Dec 06 
01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.467707 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbsk4\" (UniqueName: \"kubernetes.io/projected/8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75-kube-api-access-hbsk4\") pod \"openstack-operator-index-qv7hh\" (UID: \"8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75\") " pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.476514 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rzpch" podStartSLOduration=2.185976397 podStartE2EDuration="4.476493958s" podCreationTimestamp="2025-12-06 01:18:12 +0000 UTC" firstStartedPulling="2025-12-06 01:18:13.286381408 +0000 UTC m=+966.636671906" lastFinishedPulling="2025-12-06 01:18:15.576898999 +0000 UTC m=+968.927189467" observedRunningTime="2025-12-06 01:18:16.475311666 +0000 UTC m=+969.825602174" watchObservedRunningTime="2025-12-06 01:18:16.476493958 +0000 UTC m=+969.826784416" Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.561372 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:16 crc kubenswrapper[4991]: I1206 01:18:16.870796 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.040954 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz7zp\" (UniqueName: \"kubernetes.io/projected/8ad09275-756f-4d22-8744-f2f17045b88f-kube-api-access-vz7zp\") pod \"8ad09275-756f-4d22-8744-f2f17045b88f\" (UID: \"8ad09275-756f-4d22-8744-f2f17045b88f\") " Dec 06 01:18:17 crc kubenswrapper[4991]: W1206 01:18:17.047298 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ed7eb4a_d10c_40bd_9db4_2f8c0434fd75.slice/crio-6ecd781938758633abf48980cb0e7f163bc8b2895a67777d5cbe9f6446227ff8 WatchSource:0}: Error finding container 6ecd781938758633abf48980cb0e7f163bc8b2895a67777d5cbe9f6446227ff8: Status 404 returned error can't find the container with id 6ecd781938758633abf48980cb0e7f163bc8b2895a67777d5cbe9f6446227ff8 Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.050316 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad09275-756f-4d22-8744-f2f17045b88f-kube-api-access-vz7zp" (OuterVolumeSpecName: "kube-api-access-vz7zp") pod "8ad09275-756f-4d22-8744-f2f17045b88f" (UID: "8ad09275-756f-4d22-8744-f2f17045b88f"). InnerVolumeSpecName "kube-api-access-vz7zp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.055891 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qv7hh"] Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.142816 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz7zp\" (UniqueName: \"kubernetes.io/projected/8ad09275-756f-4d22-8744-f2f17045b88f-kube-api-access-vz7zp\") on node \"crc\" DevicePath \"\"" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.463200 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qv7hh" event={"ID":"8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75","Type":"ContainerStarted","Data":"1251444a26d000d876a9d9032b306c97da900d496b370ba9e977a79038f911b5"} Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.463273 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qv7hh" event={"ID":"8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75","Type":"ContainerStarted","Data":"6ecd781938758633abf48980cb0e7f163bc8b2895a67777d5cbe9f6446227ff8"} Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.466979 4991 generic.go:334] "Generic (PLEG): container finished" podID="8ad09275-756f-4d22-8744-f2f17045b88f" containerID="b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe" exitCode=0 Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.467092 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rzpch" event={"ID":"8ad09275-756f-4d22-8744-f2f17045b88f","Type":"ContainerDied","Data":"b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe"} Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.467144 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rzpch" 
event={"ID":"8ad09275-756f-4d22-8744-f2f17045b88f","Type":"ContainerDied","Data":"7ea6a358af971de18f4467b3fb68d147c4b722456a9fb0d89071aa396ffd3eaa"} Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.467158 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rzpch" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.467184 4991 scope.go:117] "RemoveContainer" containerID="b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.493148 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qv7hh" podStartSLOduration=1.439917641 podStartE2EDuration="1.493069369s" podCreationTimestamp="2025-12-06 01:18:16 +0000 UTC" firstStartedPulling="2025-12-06 01:18:17.050745774 +0000 UTC m=+970.401036242" lastFinishedPulling="2025-12-06 01:18:17.103897502 +0000 UTC m=+970.454187970" observedRunningTime="2025-12-06 01:18:17.484293115 +0000 UTC m=+970.834583623" watchObservedRunningTime="2025-12-06 01:18:17.493069369 +0000 UTC m=+970.843359877" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.508016 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rzpch"] Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.512894 4991 scope.go:117] "RemoveContainer" containerID="b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe" Dec 06 01:18:17 crc kubenswrapper[4991]: E1206 01:18:17.513663 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe\": container with ID starting with b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe not found: ID does not exist" containerID="b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe" Dec 06 01:18:17 
crc kubenswrapper[4991]: I1206 01:18:17.513789 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe"} err="failed to get container status \"b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe\": rpc error: code = NotFound desc = could not find container \"b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe\": container with ID starting with b22bf78358af1c49406dd0e31299c881f98fb33555e66cdc6c470bddb30ff5fe not found: ID does not exist" Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.517676 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rzpch"] Dec 06 01:18:17 crc kubenswrapper[4991]: I1206 01:18:17.971534 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-t5xxb" Dec 06 01:18:18 crc kubenswrapper[4991]: I1206 01:18:18.558217 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fkgqs" Dec 06 01:18:18 crc kubenswrapper[4991]: I1206 01:18:18.697308 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-qtv46" Dec 06 01:18:19 crc kubenswrapper[4991]: I1206 01:18:19.049777 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad09275-756f-4d22-8744-f2f17045b88f" path="/var/lib/kubelet/pods/8ad09275-756f-4d22-8744-f2f17045b88f/volumes" Dec 06 01:18:26 crc kubenswrapper[4991]: I1206 01:18:26.561681 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:26 crc kubenswrapper[4991]: I1206 01:18:26.562389 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:26 crc kubenswrapper[4991]: I1206 
01:18:26.605039 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:27 crc kubenswrapper[4991]: I1206 01:18:27.578751 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qv7hh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.207452 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh"] Dec 06 01:18:35 crc kubenswrapper[4991]: E1206 01:18:35.208752 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad09275-756f-4d22-8744-f2f17045b88f" containerName="registry-server" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.208771 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad09275-756f-4d22-8744-f2f17045b88f" containerName="registry-server" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.208925 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad09275-756f-4d22-8744-f2f17045b88f" containerName="registry-server" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.209864 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.213632 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-q6ftn" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.229034 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh"] Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.248229 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.248277 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.248306 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spf4d\" (UniqueName: \"kubernetes.io/projected/8263afa1-30a1-4b88-b31a-7ca24507edbf-kube-api-access-spf4d\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 
01:18:35.349729 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.349792 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.349829 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spf4d\" (UniqueName: \"kubernetes.io/projected/8263afa1-30a1-4b88-b31a-7ca24507edbf-kube-api-access-spf4d\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.350433 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-bundle\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.350552 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-util\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.374541 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spf4d\" (UniqueName: \"kubernetes.io/projected/8263afa1-30a1-4b88-b31a-7ca24507edbf-kube-api-access-spf4d\") pod \"be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.592450 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:35 crc kubenswrapper[4991]: I1206 01:18:35.890917 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh"] Dec 06 01:18:36 crc kubenswrapper[4991]: I1206 01:18:36.629572 4991 generic.go:334] "Generic (PLEG): container finished" podID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerID="260680f6b244d7c9e21e7dbafa5f9d25463d58a75d47575cc3ddf81d694629e3" exitCode=0 Dec 06 01:18:36 crc kubenswrapper[4991]: I1206 01:18:36.629828 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" event={"ID":"8263afa1-30a1-4b88-b31a-7ca24507edbf","Type":"ContainerDied","Data":"260680f6b244d7c9e21e7dbafa5f9d25463d58a75d47575cc3ddf81d694629e3"} Dec 06 01:18:36 crc kubenswrapper[4991]: I1206 01:18:36.630209 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" event={"ID":"8263afa1-30a1-4b88-b31a-7ca24507edbf","Type":"ContainerStarted","Data":"71038b4b2ac7e3e01bfdadd482f988175d0167498145fcbd1b5d71c7dddf94ca"} Dec 06 01:18:37 crc kubenswrapper[4991]: I1206 01:18:37.638977 4991 generic.go:334] "Generic (PLEG): container finished" podID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerID="b1e0c31bf175945af82afc39def795e4ae068b2fe957aa8e5f73aae4b40073d1" exitCode=0 Dec 06 01:18:37 crc kubenswrapper[4991]: I1206 01:18:37.639040 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" event={"ID":"8263afa1-30a1-4b88-b31a-7ca24507edbf","Type":"ContainerDied","Data":"b1e0c31bf175945af82afc39def795e4ae068b2fe957aa8e5f73aae4b40073d1"} Dec 06 01:18:38 crc kubenswrapper[4991]: I1206 01:18:38.650354 4991 generic.go:334] "Generic (PLEG): container finished" podID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerID="b7d635b78cc93c2263e39170bccc62b32d3d23f687d586088dddd44ee57784a3" exitCode=0 Dec 06 01:18:38 crc kubenswrapper[4991]: I1206 01:18:38.650463 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" event={"ID":"8263afa1-30a1-4b88-b31a-7ca24507edbf","Type":"ContainerDied","Data":"b7d635b78cc93c2263e39170bccc62b32d3d23f687d586088dddd44ee57784a3"} Dec 06 01:18:39 crc kubenswrapper[4991]: I1206 01:18:39.961295 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.059956 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-util\") pod \"8263afa1-30a1-4b88-b31a-7ca24507edbf\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.060057 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spf4d\" (UniqueName: \"kubernetes.io/projected/8263afa1-30a1-4b88-b31a-7ca24507edbf-kube-api-access-spf4d\") pod \"8263afa1-30a1-4b88-b31a-7ca24507edbf\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.060113 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-bundle\") pod \"8263afa1-30a1-4b88-b31a-7ca24507edbf\" (UID: \"8263afa1-30a1-4b88-b31a-7ca24507edbf\") " Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.060866 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-bundle" (OuterVolumeSpecName: "bundle") pod "8263afa1-30a1-4b88-b31a-7ca24507edbf" (UID: "8263afa1-30a1-4b88-b31a-7ca24507edbf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.066656 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8263afa1-30a1-4b88-b31a-7ca24507edbf-kube-api-access-spf4d" (OuterVolumeSpecName: "kube-api-access-spf4d") pod "8263afa1-30a1-4b88-b31a-7ca24507edbf" (UID: "8263afa1-30a1-4b88-b31a-7ca24507edbf"). InnerVolumeSpecName "kube-api-access-spf4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.074766 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-util" (OuterVolumeSpecName: "util") pod "8263afa1-30a1-4b88-b31a-7ca24507edbf" (UID: "8263afa1-30a1-4b88-b31a-7ca24507edbf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.164587 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-util\") on node \"crc\" DevicePath \"\"" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.164689 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spf4d\" (UniqueName: \"kubernetes.io/projected/8263afa1-30a1-4b88-b31a-7ca24507edbf-kube-api-access-spf4d\") on node \"crc\" DevicePath \"\"" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.164723 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8263afa1-30a1-4b88-b31a-7ca24507edbf-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.665136 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" event={"ID":"8263afa1-30a1-4b88-b31a-7ca24507edbf","Type":"ContainerDied","Data":"71038b4b2ac7e3e01bfdadd482f988175d0167498145fcbd1b5d71c7dddf94ca"} Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.665368 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71038b4b2ac7e3e01bfdadd482f988175d0167498145fcbd1b5d71c7dddf94ca" Dec 06 01:18:40 crc kubenswrapper[4991]: I1206 01:18:40.665218 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh" Dec 06 01:18:40 crc kubenswrapper[4991]: E1206 01:18:40.766531 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8263afa1_30a1_4b88_b31a_7ca24507edbf.slice\": RecentStats: unable to find data in memory cache]" Dec 06 01:18:46 crc kubenswrapper[4991]: I1206 01:18:46.739702 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:18:46 crc kubenswrapper[4991]: I1206 01:18:46.740691 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.475236 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7"] Dec 06 01:18:47 crc kubenswrapper[4991]: E1206 01:18:47.475526 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="pull" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.475540 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="pull" Dec 06 01:18:47 crc kubenswrapper[4991]: E1206 01:18:47.475556 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="extract" Dec 06 01:18:47 crc 
kubenswrapper[4991]: I1206 01:18:47.475563 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="extract" Dec 06 01:18:47 crc kubenswrapper[4991]: E1206 01:18:47.475585 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="util" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.475594 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="util" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.475725 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8263afa1-30a1-4b88-b31a-7ca24507edbf" containerName="extract" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.476201 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.479212 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-rbtv7" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.498848 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7"] Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.602001 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48cc\" (UniqueName: \"kubernetes.io/projected/dd84f245-235a-406c-be63-88e9170e9902-kube-api-access-q48cc\") pod \"openstack-operator-controller-operator-5f6c6584bc-6swc7\" (UID: \"dd84f245-235a-406c-be63-88e9170e9902\") " pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.703386 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q48cc\" (UniqueName: \"kubernetes.io/projected/dd84f245-235a-406c-be63-88e9170e9902-kube-api-access-q48cc\") pod \"openstack-operator-controller-operator-5f6c6584bc-6swc7\" (UID: \"dd84f245-235a-406c-be63-88e9170e9902\") " pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.728066 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48cc\" (UniqueName: \"kubernetes.io/projected/dd84f245-235a-406c-be63-88e9170e9902-kube-api-access-q48cc\") pod \"openstack-operator-controller-operator-5f6c6584bc-6swc7\" (UID: \"dd84f245-235a-406c-be63-88e9170e9902\") " pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:47 crc kubenswrapper[4991]: I1206 01:18:47.800946 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:48 crc kubenswrapper[4991]: I1206 01:18:48.313371 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7"] Dec 06 01:18:48 crc kubenswrapper[4991]: I1206 01:18:48.736291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" event={"ID":"dd84f245-235a-406c-be63-88e9170e9902","Type":"ContainerStarted","Data":"c48b151f77d6b960ae37863c46f5ab0835122a9aad1e10372d5c8c8937d3e269"} Dec 06 01:18:53 crc kubenswrapper[4991]: I1206 01:18:53.799319 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" event={"ID":"dd84f245-235a-406c-be63-88e9170e9902","Type":"ContainerStarted","Data":"7ea72787422e0fe027f396994e046a0b407d8ddb1e13d946a59f9bff2adabc2b"} Dec 06 01:18:53 crc kubenswrapper[4991]: I1206 01:18:53.800173 4991 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:57 crc kubenswrapper[4991]: I1206 01:18:57.805582 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" Dec 06 01:18:57 crc kubenswrapper[4991]: I1206 01:18:57.864380 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5f6c6584bc-6swc7" podStartSLOduration=6.526041416 podStartE2EDuration="10.864359997s" podCreationTimestamp="2025-12-06 01:18:47 +0000 UTC" firstStartedPulling="2025-12-06 01:18:48.340280816 +0000 UTC m=+1001.690571284" lastFinishedPulling="2025-12-06 01:18:52.678599397 +0000 UTC m=+1006.028889865" observedRunningTime="2025-12-06 01:18:53.846701016 +0000 UTC m=+1007.196991534" watchObservedRunningTime="2025-12-06 01:18:57.864359997 +0000 UTC m=+1011.214650475" Dec 06 01:19:16 crc kubenswrapper[4991]: I1206 01:19:16.739852 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:19:16 crc kubenswrapper[4991]: I1206 01:19:16.741146 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.596090 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2"] Dec 06 01:19:34 crc kubenswrapper[4991]: 
I1206 01:19:34.598565 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.599848 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.601007 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.601936 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wnmtj" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.609611 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-79zjd" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.617244 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.621935 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.646579 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.647815 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.650091 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mjh4q" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.678038 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.679105 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.685279 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-r7w7w" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.686633 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.710423 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.714785 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.715812 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.719217 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gbltm" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.725883 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.728135 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.731368 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bszft" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.748969 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.754322 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.767803 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tnq\" (UniqueName: \"kubernetes.io/projected/c99be400-34b2-431b-ba71-9c6c03f70c91-kube-api-access-58tnq\") pod \"cinder-operator-controller-manager-859b6ccc6-gbqt8\" (UID: \"c99be400-34b2-431b-ba71-9c6c03f70c91\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.767894 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppgj\" (UniqueName: 
\"kubernetes.io/projected/7ceea028-825c-4235-b1bb-52adec27f46d-kube-api-access-hppgj\") pod \"designate-operator-controller-manager-78b4bc895b-twvbc\" (UID: \"7ceea028-825c-4235-b1bb-52adec27f46d\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.767928 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrkdf\" (UniqueName: \"kubernetes.io/projected/66cf73ac-d6ca-4a8c-855b-3d26bec5adb7-kube-api-access-vrkdf\") pod \"barbican-operator-controller-manager-7d9dfd778-crkd2\" (UID: \"66cf73ac-d6ca-4a8c-855b-3d26bec5adb7\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.780777 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-96kw2"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.795054 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.795746 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.796165 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.803977 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.804257 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-79cw4" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.804422 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-86fgh" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.822602 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.829166 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.834819 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-p8q97" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.848266 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-96kw2"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.863726 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871553 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppgj\" (UniqueName: \"kubernetes.io/projected/7ceea028-825c-4235-b1bb-52adec27f46d-kube-api-access-hppgj\") pod \"designate-operator-controller-manager-78b4bc895b-twvbc\" (UID: \"7ceea028-825c-4235-b1bb-52adec27f46d\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871621 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdcl\" (UniqueName: \"kubernetes.io/projected/37dd0775-3a19-4005-9cf0-7c038c293004-kube-api-access-4cdcl\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871654 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrkdf\" (UniqueName: \"kubernetes.io/projected/66cf73ac-d6ca-4a8c-855b-3d26bec5adb7-kube-api-access-vrkdf\") pod \"barbican-operator-controller-manager-7d9dfd778-crkd2\" (UID: 
\"66cf73ac-d6ca-4a8c-855b-3d26bec5adb7\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871692 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hzd\" (UniqueName: \"kubernetes.io/projected/4275f29c-130a-4e02-aeb5-6410643706b8-kube-api-access-l5hzd\") pod \"glance-operator-controller-manager-77987cd8cd-hll9p\" (UID: \"4275f29c-130a-4e02-aeb5-6410643706b8\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58tnq\" (UniqueName: \"kubernetes.io/projected/c99be400-34b2-431b-ba71-9c6c03f70c91-kube-api-access-58tnq\") pod \"cinder-operator-controller-manager-859b6ccc6-gbqt8\" (UID: \"c99be400-34b2-431b-ba71-9c6c03f70c91\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871749 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vnmc\" (UniqueName: \"kubernetes.io/projected/d21383b5-1b67-4b8f-a176-05769a28528d-kube-api-access-8vnmc\") pod \"keystone-operator-controller-manager-7765d96ddf-lwqvk\" (UID: \"d21383b5-1b67-4b8f-a176-05769a28528d\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871771 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27b8\" (UniqueName: \"kubernetes.io/projected/852a1018-8ddc-4a27-ba0c-8a1fcf0d010f-kube-api-access-s27b8\") pod \"ironic-operator-controller-manager-6c548fd776-ftl59\" (UID: \"852a1018-8ddc-4a27-ba0c-8a1fcf0d010f\") " 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871790 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbtt\" (UniqueName: \"kubernetes.io/projected/14a62a5b-5719-4192-b2d8-40dc2e000f41-kube-api-access-mnbtt\") pod \"heat-operator-controller-manager-5f64f6f8bb-5j6br\" (UID: \"14a62a5b-5719-4192-b2d8-40dc2e000f41\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871810 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwl4d\" (UniqueName: \"kubernetes.io/projected/fd88f0ae-0cbd-4958-a05a-51e18d3ff32f-kube-api-access-pwl4d\") pod \"horizon-operator-controller-manager-68c6d99b8f-h9lxn\" (UID: \"fd88f0ae-0cbd-4958-a05a-51e18d3ff32f\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.871844 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.882277 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.894614 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.895670 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.906503 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj"] Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.907757 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.908211 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2pwlf" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.911359 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b6r4j" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.941891 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tnq\" (UniqueName: \"kubernetes.io/projected/c99be400-34b2-431b-ba71-9c6c03f70c91-kube-api-access-58tnq\") pod \"cinder-operator-controller-manager-859b6ccc6-gbqt8\" (UID: \"c99be400-34b2-431b-ba71-9c6c03f70c91\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.955828 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrkdf\" (UniqueName: \"kubernetes.io/projected/66cf73ac-d6ca-4a8c-855b-3d26bec5adb7-kube-api-access-vrkdf\") pod \"barbican-operator-controller-manager-7d9dfd778-crkd2\" (UID: \"66cf73ac-d6ca-4a8c-855b-3d26bec5adb7\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.962969 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppgj\" 
(UniqueName: \"kubernetes.io/projected/7ceea028-825c-4235-b1bb-52adec27f46d-kube-api-access-hppgj\") pod \"designate-operator-controller-manager-78b4bc895b-twvbc\" (UID: \"7ceea028-825c-4235-b1bb-52adec27f46d\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.965408 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993482 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vnmc\" (UniqueName: \"kubernetes.io/projected/d21383b5-1b67-4b8f-a176-05769a28528d-kube-api-access-8vnmc\") pod \"keystone-operator-controller-manager-7765d96ddf-lwqvk\" (UID: \"d21383b5-1b67-4b8f-a176-05769a28528d\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993545 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27b8\" (UniqueName: \"kubernetes.io/projected/852a1018-8ddc-4a27-ba0c-8a1fcf0d010f-kube-api-access-s27b8\") pod \"ironic-operator-controller-manager-6c548fd776-ftl59\" (UID: \"852a1018-8ddc-4a27-ba0c-8a1fcf0d010f\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993577 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbtt\" (UniqueName: \"kubernetes.io/projected/14a62a5b-5719-4192-b2d8-40dc2e000f41-kube-api-access-mnbtt\") pod \"heat-operator-controller-manager-5f64f6f8bb-5j6br\" (UID: \"14a62a5b-5719-4192-b2d8-40dc2e000f41\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993596 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pwl4d\" (UniqueName: \"kubernetes.io/projected/fd88f0ae-0cbd-4958-a05a-51e18d3ff32f-kube-api-access-pwl4d\") pod \"horizon-operator-controller-manager-68c6d99b8f-h9lxn\" (UID: \"fd88f0ae-0cbd-4958-a05a-51e18d3ff32f\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993672 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993743 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdcl\" (UniqueName: \"kubernetes.io/projected/37dd0775-3a19-4005-9cf0-7c038c293004-kube-api-access-4cdcl\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:34 crc kubenswrapper[4991]: I1206 01:19:34.993832 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hzd\" (UniqueName: \"kubernetes.io/projected/4275f29c-130a-4e02-aeb5-6410643706b8-kube-api-access-l5hzd\") pod \"glance-operator-controller-manager-77987cd8cd-hll9p\" (UID: \"4275f29c-130a-4e02-aeb5-6410643706b8\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.003891 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.005705 4991 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.005801 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert podName:37dd0775-3a19-4005-9cf0-7c038c293004 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:35.505778001 +0000 UTC m=+1048.856068469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert") pod "infra-operator-controller-manager-57548d458d-96kw2" (UID: "37dd0775-3a19-4005-9cf0-7c038c293004") : secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.100348 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbtt\" (UniqueName: \"kubernetes.io/projected/14a62a5b-5719-4192-b2d8-40dc2e000f41-kube-api-access-mnbtt\") pod \"heat-operator-controller-manager-5f64f6f8bb-5j6br\" (UID: \"14a62a5b-5719-4192-b2d8-40dc2e000f41\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.101148 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq22k\" (UniqueName: \"kubernetes.io/projected/4fafae92-8cd1-450f-9e3b-7abd1020ab75-kube-api-access-dq22k\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4pvnj\" (UID: \"4fafae92-8cd1-450f-9e3b-7abd1020ab75\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.101185 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcsvd\" (UniqueName: \"kubernetes.io/projected/7f3695d3-1d23-4779-90a6-d1b62ac3edc5-kube-api-access-jcsvd\") pod \"manila-operator-controller-manager-7c79b5df47-trx9l\" (UID: \"7f3695d3-1d23-4779-90a6-d1b62ac3edc5\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.103812 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.106459 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdcl\" (UniqueName: \"kubernetes.io/projected/37dd0775-3a19-4005-9cf0-7c038c293004-kube-api-access-4cdcl\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.111712 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vnmc\" (UniqueName: \"kubernetes.io/projected/d21383b5-1b67-4b8f-a176-05769a28528d-kube-api-access-8vnmc\") pod \"keystone-operator-controller-manager-7765d96ddf-lwqvk\" (UID: \"d21383b5-1b67-4b8f-a176-05769a28528d\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.112015 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hzd\" (UniqueName: \"kubernetes.io/projected/4275f29c-130a-4e02-aeb5-6410643706b8-kube-api-access-l5hzd\") pod \"glance-operator-controller-manager-77987cd8cd-hll9p\" (UID: \"4275f29c-130a-4e02-aeb5-6410643706b8\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.112553 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27b8\" (UniqueName: \"kubernetes.io/projected/852a1018-8ddc-4a27-ba0c-8a1fcf0d010f-kube-api-access-s27b8\") pod \"ironic-operator-controller-manager-6c548fd776-ftl59\" (UID: \"852a1018-8ddc-4a27-ba0c-8a1fcf0d010f\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.125151 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.129417 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwl4d\" (UniqueName: \"kubernetes.io/projected/fd88f0ae-0cbd-4958-a05a-51e18d3ff32f-kube-api-access-pwl4d\") pod \"horizon-operator-controller-manager-68c6d99b8f-h9lxn\" (UID: \"fd88f0ae-0cbd-4958-a05a-51e18d3ff32f\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.140073 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.153804 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.155205 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.164895 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pcw9d" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.194263 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.195651 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.208058 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.209310 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.210773 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq22k\" (UniqueName: \"kubernetes.io/projected/4fafae92-8cd1-450f-9e3b-7abd1020ab75-kube-api-access-dq22k\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4pvnj\" (UID: \"4fafae92-8cd1-450f-9e3b-7abd1020ab75\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.210805 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcsvd\" (UniqueName: \"kubernetes.io/projected/7f3695d3-1d23-4779-90a6-d1b62ac3edc5-kube-api-access-jcsvd\") pod \"manila-operator-controller-manager-7c79b5df47-trx9l\" (UID: \"7f3695d3-1d23-4779-90a6-d1b62ac3edc5\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.217798 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k45hw" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.226175 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.229813 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.237942 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.241142 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq22k\" (UniqueName: \"kubernetes.io/projected/4fafae92-8cd1-450f-9e3b-7abd1020ab75-kube-api-access-dq22k\") pod \"mariadb-operator-controller-manager-56bbcc9d85-4pvnj\" (UID: \"4fafae92-8cd1-450f-9e3b-7abd1020ab75\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.241859 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-n2pqb" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.248144 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcsvd\" (UniqueName: \"kubernetes.io/projected/7f3695d3-1d23-4779-90a6-d1b62ac3edc5-kube-api-access-jcsvd\") pod \"manila-operator-controller-manager-7c79b5df47-trx9l\" (UID: \"7f3695d3-1d23-4779-90a6-d1b62ac3edc5\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.256116 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.277134 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.290884 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.292047 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.295188 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.295412 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pk65b" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.310279 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.313021 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988w8\" (UniqueName: \"kubernetes.io/projected/e5351a0d-ece3-441f-95f6-ccc4281e9462-kube-api-access-988w8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-rlhms\" (UID: \"e5351a0d-ece3-441f-95f6-ccc4281e9462\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.313059 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzlj\" (UniqueName: \"kubernetes.io/projected/8c5d96f9-58a1-4ee7-82dd-f71237ef875c-kube-api-access-mlzlj\") pod \"nova-operator-controller-manager-697bc559fc-xjgls\" (UID: \"8c5d96f9-58a1-4ee7-82dd-f71237ef875c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.315055 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.321761 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-x9gzz" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.330293 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.341054 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.350221 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.352449 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.363898 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.364828 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.370466 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-72txt"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.371642 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.377334 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vjptt" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.402104 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-72txt"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.402676 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.412178 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.413357 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.414861 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z5qd\" (UniqueName: \"kubernetes.io/projected/1d549c0d-f4fd-4eeb-852b-78e302f65110-kube-api-access-4z5qd\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.414922 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-988w8\" (UniqueName: \"kubernetes.io/projected/e5351a0d-ece3-441f-95f6-ccc4281e9462-kube-api-access-988w8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-rlhms\" (UID: \"e5351a0d-ece3-441f-95f6-ccc4281e9462\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.414946 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzlj\" (UniqueName: \"kubernetes.io/projected/8c5d96f9-58a1-4ee7-82dd-f71237ef875c-kube-api-access-mlzlj\") pod \"nova-operator-controller-manager-697bc559fc-xjgls\" (UID: \"8c5d96f9-58a1-4ee7-82dd-f71237ef875c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.414964 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 
01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.415082 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc47v\" (UniqueName: \"kubernetes.io/projected/a4fd6e71-89f4-41d3-beed-a894e45b8fe8-kube-api-access-jc47v\") pod \"ovn-operator-controller-manager-b6456fdb6-9phkz\" (UID: \"a4fd6e71-89f4-41d3-beed-a894e45b8fe8\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.415109 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjql\" (UniqueName: \"kubernetes.io/projected/91bdc603-1933-4f22-9ecd-d3e95f2f0924-kube-api-access-prjql\") pod \"octavia-operator-controller-manager-998648c74-4hx8k\" (UID: \"91bdc603-1933-4f22-9ecd-d3e95f2f0924\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.421392 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kdtbl" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.428661 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.447074 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.448500 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.455456 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-km7n2" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.458907 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.462424 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzlj\" (UniqueName: \"kubernetes.io/projected/8c5d96f9-58a1-4ee7-82dd-f71237ef875c-kube-api-access-mlzlj\") pod \"nova-operator-controller-manager-697bc559fc-xjgls\" (UID: \"8c5d96f9-58a1-4ee7-82dd-f71237ef875c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.463698 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-988w8\" (UniqueName: \"kubernetes.io/projected/e5351a0d-ece3-441f-95f6-ccc4281e9462-kube-api-access-988w8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-rlhms\" (UID: \"e5351a0d-ece3-441f-95f6-ccc4281e9462\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.511405 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516643 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516675 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc47v\" (UniqueName: \"kubernetes.io/projected/a4fd6e71-89f4-41d3-beed-a894e45b8fe8-kube-api-access-jc47v\") pod \"ovn-operator-controller-manager-b6456fdb6-9phkz\" (UID: \"a4fd6e71-89f4-41d3-beed-a894e45b8fe8\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516724 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjql\" (UniqueName: \"kubernetes.io/projected/91bdc603-1933-4f22-9ecd-d3e95f2f0924-kube-api-access-prjql\") pod \"octavia-operator-controller-manager-998648c74-4hx8k\" (UID: \"91bdc603-1933-4f22-9ecd-d3e95f2f0924\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516817 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z5qd\" (UniqueName: \"kubernetes.io/projected/1d549c0d-f4fd-4eeb-852b-78e302f65110-kube-api-access-4z5qd\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 
01:19:35.516867 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrnv\" (UniqueName: \"kubernetes.io/projected/725ee172-29cd-40c6-b0a1-9be795634566-kube-api-access-tdrnv\") pod \"placement-operator-controller-manager-78f8948974-72txt\" (UID: \"725ee172-29cd-40c6-b0a1-9be795634566\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516891 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcpj8\" (UniqueName: \"kubernetes.io/projected/54706fa6-a00c-4f66-bce5-660d5475ff7b-kube-api-access-gcpj8\") pod \"telemetry-operator-controller-manager-76cc84c6bb-gdckl\" (UID: \"54706fa6-a00c-4f66-bce5-660d5475ff7b\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516939 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz2dk\" (UniqueName: \"kubernetes.io/projected/6b1fa1e2-0eb6-48bf-bebc-93db7d90a644-kube-api-access-tz2dk\") pod \"swift-operator-controller-manager-5f8c65bbfc-gkl7x\" (UID: \"6b1fa1e2-0eb6-48bf-bebc-93db7d90a644\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.516960 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.517128 4991 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.517190 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert podName:37dd0775-3a19-4005-9cf0-7c038c293004 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:36.517154257 +0000 UTC m=+1049.867444725 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert") pod "infra-operator-controller-manager-57548d458d-96kw2" (UID: "37dd0775-3a19-4005-9cf0-7c038c293004") : secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.519250 4991 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.519359 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert podName:1d549c0d-f4fd-4eeb-852b-78e302f65110 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:36.019330004 +0000 UTC m=+1049.369620612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" (UID: "1d549c0d-f4fd-4eeb-852b-78e302f65110") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.519613 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.521309 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.526783 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-w4ch6" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.540874 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.543166 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z5qd\" (UniqueName: \"kubernetes.io/projected/1d549c0d-f4fd-4eeb-852b-78e302f65110-kube-api-access-4z5qd\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.547020 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjql\" (UniqueName: \"kubernetes.io/projected/91bdc603-1933-4f22-9ecd-d3e95f2f0924-kube-api-access-prjql\") pod \"octavia-operator-controller-manager-998648c74-4hx8k\" (UID: \"91bdc603-1933-4f22-9ecd-d3e95f2f0924\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.548032 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc47v\" (UniqueName: \"kubernetes.io/projected/a4fd6e71-89f4-41d3-beed-a894e45b8fe8-kube-api-access-jc47v\") pod \"ovn-operator-controller-manager-b6456fdb6-9phkz\" (UID: \"a4fd6e71-89f4-41d3-beed-a894e45b8fe8\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.554588 4991 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.560371 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.562221 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.571175 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fgtzv" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.585733 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.603048 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.620672 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56pkq\" (UniqueName: \"kubernetes.io/projected/7869a8de-b32f-4600-af69-49a42ec4d096-kube-api-access-56pkq\") pod \"test-operator-controller-manager-5854674fcc-5v8fh\" (UID: \"7869a8de-b32f-4600-af69-49a42ec4d096\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.620782 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdrnv\" (UniqueName: \"kubernetes.io/projected/725ee172-29cd-40c6-b0a1-9be795634566-kube-api-access-tdrnv\") pod \"placement-operator-controller-manager-78f8948974-72txt\" (UID: \"725ee172-29cd-40c6-b0a1-9be795634566\") " 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.620816 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcpj8\" (UniqueName: \"kubernetes.io/projected/54706fa6-a00c-4f66-bce5-660d5475ff7b-kube-api-access-gcpj8\") pod \"telemetry-operator-controller-manager-76cc84c6bb-gdckl\" (UID: \"54706fa6-a00c-4f66-bce5-660d5475ff7b\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.620880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz2dk\" (UniqueName: \"kubernetes.io/projected/6b1fa1e2-0eb6-48bf-bebc-93db7d90a644-kube-api-access-tz2dk\") pod \"swift-operator-controller-manager-5f8c65bbfc-gkl7x\" (UID: \"6b1fa1e2-0eb6-48bf-bebc-93db7d90a644\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.643600 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcpj8\" (UniqueName: \"kubernetes.io/projected/54706fa6-a00c-4f66-bce5-660d5475ff7b-kube-api-access-gcpj8\") pod \"telemetry-operator-controller-manager-76cc84c6bb-gdckl\" (UID: \"54706fa6-a00c-4f66-bce5-660d5475ff7b\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.645048 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz2dk\" (UniqueName: \"kubernetes.io/projected/6b1fa1e2-0eb6-48bf-bebc-93db7d90a644-kube-api-access-tz2dk\") pod \"swift-operator-controller-manager-5f8c65bbfc-gkl7x\" (UID: \"6b1fa1e2-0eb6-48bf-bebc-93db7d90a644\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.652057 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdrnv\" (UniqueName: \"kubernetes.io/projected/725ee172-29cd-40c6-b0a1-9be795634566-kube-api-access-tdrnv\") pod \"placement-operator-controller-manager-78f8948974-72txt\" (UID: \"725ee172-29cd-40c6-b0a1-9be795634566\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.668873 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.684804 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.686063 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.696853 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.696916 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2qrrk" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.697194 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.703334 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.708399 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.723781 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxp4z\" (UniqueName: \"kubernetes.io/projected/35ecf5a7-ec56-4cfc-a96b-84dada809373-kube-api-access-mxp4z\") pod \"watcher-operator-controller-manager-769dc69bc-r6xm6\" (UID: \"35ecf5a7-ec56-4cfc-a96b-84dada809373\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.723880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56pkq\" (UniqueName: \"kubernetes.io/projected/7869a8de-b32f-4600-af69-49a42ec4d096-kube-api-access-56pkq\") pod \"test-operator-controller-manager-5854674fcc-5v8fh\" (UID: \"7869a8de-b32f-4600-af69-49a42ec4d096\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.730869 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.732400 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.737551 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6w8b4" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.758233 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56pkq\" (UniqueName: \"kubernetes.io/projected/7869a8de-b32f-4600-af69-49a42ec4d096-kube-api-access-56pkq\") pod \"test-operator-controller-manager-5854674fcc-5v8fh\" (UID: \"7869a8de-b32f-4600-af69-49a42ec4d096\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.771662 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.772178 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.789707 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.816113 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.825840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgj6\" (UniqueName: \"kubernetes.io/projected/5cfe648a-c326-46aa-83ed-31659feb31d0-kube-api-access-zhgj6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z5dnf\" (UID: \"5cfe648a-c326-46aa-83ed-31659feb31d0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.825942 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.825975 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.826038 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2fjl\" (UniqueName: \"kubernetes.io/projected/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-kube-api-access-j2fjl\") 
pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.826101 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxp4z\" (UniqueName: \"kubernetes.io/projected/35ecf5a7-ec56-4cfc-a96b-84dada809373-kube-api-access-mxp4z\") pod \"watcher-operator-controller-manager-769dc69bc-r6xm6\" (UID: \"35ecf5a7-ec56-4cfc-a96b-84dada809373\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.827743 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.844102 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxp4z\" (UniqueName: \"kubernetes.io/projected/35ecf5a7-ec56-4cfc-a96b-84dada809373-kube-api-access-mxp4z\") pod \"watcher-operator-controller-manager-769dc69bc-r6xm6\" (UID: \"35ecf5a7-ec56-4cfc-a96b-84dada809373\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.852959 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.925033 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.929378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgj6\" (UniqueName: \"kubernetes.io/projected/5cfe648a-c326-46aa-83ed-31659feb31d0-kube-api-access-zhgj6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z5dnf\" (UID: \"5cfe648a-c326-46aa-83ed-31659feb31d0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.929469 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.929499 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.929532 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2fjl\" (UniqueName: \"kubernetes.io/projected/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-kube-api-access-j2fjl\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.930924 4991 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.931026 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:36.430999927 +0000 UTC m=+1049.781290395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "metrics-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.931242 4991 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: E1206 01:19:35.931275 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:36.431265304 +0000 UTC m=+1049.781555772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "webhook-server-cert" not found Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.959599 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59"] Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.961364 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgj6\" (UniqueName: \"kubernetes.io/projected/5cfe648a-c326-46aa-83ed-31659feb31d0-kube-api-access-zhgj6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z5dnf\" (UID: \"5cfe648a-c326-46aa-83ed-31659feb31d0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" Dec 06 01:19:35 crc kubenswrapper[4991]: I1206 01:19:35.979274 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2fjl\" (UniqueName: \"kubernetes.io/projected/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-kube-api-access-j2fjl\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.031207 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.031376 4991 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.031439 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert podName:1d549c0d-f4fd-4eeb-852b-78e302f65110 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:37.031419878 +0000 UTC m=+1050.381710346 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" (UID: "1d549c0d-f4fd-4eeb-852b-78e302f65110") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: W1206 01:19:36.047698 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod852a1018_8ddc_4a27_ba0c_8a1fcf0d010f.slice/crio-10a497a60be2bb5df2e10af5fe5804ef7930572bd26afe28983dcb4cd413244a WatchSource:0}: Error finding container 10a497a60be2bb5df2e10af5fe5804ef7930572bd26afe28983dcb4cd413244a: Status 404 returned error can't find the container with id 10a497a60be2bb5df2e10af5fe5804ef7930572bd26afe28983dcb4cd413244a Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.063912 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.128278 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.138564 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" event={"ID":"c99be400-34b2-431b-ba71-9c6c03f70c91","Type":"ContainerStarted","Data":"95d2cc4af7677c1218c6382b2c3e6412d5a0c49716b1166fa1838fa261387179"} Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.140554 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" event={"ID":"7ceea028-825c-4235-b1bb-52adec27f46d","Type":"ContainerStarted","Data":"f6b8f964d37dd9bd7a83bcdc5ddceb5bb0502e221738ed98cb9496748bd11bf5"} Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.141451 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" event={"ID":"852a1018-8ddc-4a27-ba0c-8a1fcf0d010f","Type":"ContainerStarted","Data":"10a497a60be2bb5df2e10af5fe5804ef7930572bd26afe28983dcb4cd413244a"} Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.438042 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.438118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.438367 4991 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.438506 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:37.438479221 +0000 UTC m=+1050.788769689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "metrics-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.438840 4991 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.438929 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:37.438903332 +0000 UTC m=+1050.789193800 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "webhook-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.455631 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.462695 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p"] Dec 06 01:19:36 crc kubenswrapper[4991]: W1206 01:19:36.465154 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66cf73ac_d6ca_4a8c_855b_3d26bec5adb7.slice/crio-5e800aafe54162c16c33a4aca235f027ea922bbf227e67d9b140638575ec0127 WatchSource:0}: Error finding container 5e800aafe54162c16c33a4aca235f027ea922bbf227e67d9b140638575ec0127: Status 404 returned error can't find the container with id 5e800aafe54162c16c33a4aca235f027ea922bbf227e67d9b140638575ec0127 Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.470102 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj"] Dec 06 01:19:36 crc kubenswrapper[4991]: W1206 01:19:36.472736 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4275f29c_130a_4e02_aeb5_6410643706b8.slice/crio-3427d80b65b3658a338c947d01d4da1ce732ba5e686c6e6bab57ca6865f4d35a WatchSource:0}: Error finding container 3427d80b65b3658a338c947d01d4da1ce732ba5e686c6e6bab57ca6865f4d35a: Status 404 returned error can't find the container with id 3427d80b65b3658a338c947d01d4da1ce732ba5e686c6e6bab57ca6865f4d35a Dec 06 01:19:36 crc 
kubenswrapper[4991]: W1206 01:19:36.473446 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a62a5b_5719_4192_b2d8_40dc2e000f41.slice/crio-b814f1c02531cd3aa34d8fec35eef72a8bbf51de0c57ab7eb541251b05f8f625 WatchSource:0}: Error finding container b814f1c02531cd3aa34d8fec35eef72a8bbf51de0c57ab7eb541251b05f8f625: Status 404 returned error can't find the container with id b814f1c02531cd3aa34d8fec35eef72a8bbf51de0c57ab7eb541251b05f8f625 Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.481541 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br"] Dec 06 01:19:36 crc kubenswrapper[4991]: W1206 01:19:36.485079 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd88f0ae_0cbd_4958_a05a_51e18d3ff32f.slice/crio-2d9d14969181986728d417ca3d281d0d594a0c00d0afea468789bcf696262e86 WatchSource:0}: Error finding container 2d9d14969181986728d417ca3d281d0d594a0c00d0afea468789bcf696262e86: Status 404 returned error can't find the container with id 2d9d14969181986728d417ca3d281d0d594a0c00d0afea468789bcf696262e86 Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.486898 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.538426 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.539195 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.539449 4991 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.539553 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert podName:37dd0775-3a19-4005-9cf0-7c038c293004 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:38.539523578 +0000 UTC m=+1051.889814236 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert") pod "infra-operator-controller-manager-57548d458d-96kw2" (UID: "37dd0775-3a19-4005-9cf0-7c038c293004") : secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.546495 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls"] Dec 06 01:19:36 crc kubenswrapper[4991]: W1206 01:19:36.548525 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5d96f9_58a1_4ee7_82dd_f71237ef875c.slice/crio-4b32887821d4c4ad1654fcebee0194b2809078d54925e65ce8b49d9c000e81cd WatchSource:0}: Error finding container 4b32887821d4c4ad1654fcebee0194b2809078d54925e65ce8b49d9c000e81cd: Status 404 returned error can't find the container with id 4b32887821d4c4ad1654fcebee0194b2809078d54925e65ce8b49d9c000e81cd Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.553200 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.675592 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.682405 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.707690 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k"] Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.713754 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tz2dk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-gkl7x_openstack-operators(6b1fa1e2-0eb6-48bf-bebc-93db7d90a644): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.714572 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdrnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-72txt_openstack-operators(725ee172-29cd-40c6-b0a1-9be795634566): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.715476 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gcpj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-gdckl_openstack-operators(54706fa6-a00c-4f66-bce5-660d5475ff7b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.716374 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tz2dk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-gkl7x_openstack-operators(6b1fa1e2-0eb6-48bf-bebc-93db7d90a644): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.718047 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" podUID="6b1fa1e2-0eb6-48bf-bebc-93db7d90a644" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.720249 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdrnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-72txt_openstack-operators(725ee172-29cd-40c6-b0a1-9be795634566): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.720706 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-prjql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-4hx8k_openstack-operators(91bdc603-1933-4f22-9ecd-d3e95f2f0924): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.721186 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gcpj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-gdckl_openstack-operators(54706fa6-a00c-4f66-bce5-660d5475ff7b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.721394 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz"] Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.721399 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" podUID="725ee172-29cd-40c6-b0a1-9be795634566" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.722346 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" podUID="54706fa6-a00c-4f66-bce5-660d5475ff7b" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.723318 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-prjql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-4hx8k_openstack-operators(91bdc603-1933-4f22-9ecd-d3e95f2f0924): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.726063 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" 
for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" podUID="91bdc603-1933-4f22-9ecd-d3e95f2f0924" Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.728888 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.733458 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-72txt"] Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.865996 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6"] Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.871049 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxp4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-r6xm6_openstack-operators(35ecf5a7-ec56-4cfc-a96b-84dada809373): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.873871 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxp4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-r6xm6_openstack-operators(35ecf5a7-ec56-4cfc-a96b-84dada809373): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 06 01:19:36 crc kubenswrapper[4991]: E1206 01:19:36.875111 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" podUID="35ecf5a7-ec56-4cfc-a96b-84dada809373" Dec 06 01:19:36 crc kubenswrapper[4991]: I1206 01:19:36.878284 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf"] Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.053266 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.053857 4991 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.053917 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert podName:1d549c0d-f4fd-4eeb-852b-78e302f65110 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:39.053900222 +0000 UTC m=+1052.404190680 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" (UID: "1d549c0d-f4fd-4eeb-852b-78e302f65110") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.169582 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" event={"ID":"7f3695d3-1d23-4779-90a6-d1b62ac3edc5","Type":"ContainerStarted","Data":"47118f108f83ccea38d687ceedd7fae36375f6452b3156f5a11a3793c8eebb77"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.171276 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" event={"ID":"8c5d96f9-58a1-4ee7-82dd-f71237ef875c","Type":"ContainerStarted","Data":"4b32887821d4c4ad1654fcebee0194b2809078d54925e65ce8b49d9c000e81cd"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.172555 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" event={"ID":"fd88f0ae-0cbd-4958-a05a-51e18d3ff32f","Type":"ContainerStarted","Data":"2d9d14969181986728d417ca3d281d0d594a0c00d0afea468789bcf696262e86"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.173839 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" event={"ID":"5cfe648a-c326-46aa-83ed-31659feb31d0","Type":"ContainerStarted","Data":"2f3bc2620d5d5329cfbdf69446756a97aabc2b01efe23e0c578e1a9c053e2c77"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.176291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" 
event={"ID":"35ecf5a7-ec56-4cfc-a96b-84dada809373","Type":"ContainerStarted","Data":"ece3077511890dc9ac2e21625c5a036c16adb2291aea7ed3800ff83af343c1d7"} Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.182422 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" podUID="35ecf5a7-ec56-4cfc-a96b-84dada809373" Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.182524 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" event={"ID":"a4fd6e71-89f4-41d3-beed-a894e45b8fe8","Type":"ContainerStarted","Data":"dea75229fda0147559f0623073ac1805259d0895c049f5cd579242dbff1f1cd2"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.185533 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" event={"ID":"54706fa6-a00c-4f66-bce5-660d5475ff7b","Type":"ContainerStarted","Data":"067192d595d76494cc0296b359eb3ca9c720140dedc41f84291297dfa8dba7d1"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.187567 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" event={"ID":"91bdc603-1933-4f22-9ecd-d3e95f2f0924","Type":"ContainerStarted","Data":"380067ea26c27bb67d12671c98135057da62d4d9fee7cbf3614fa60e35d3a7bc"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.189400 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" event={"ID":"725ee172-29cd-40c6-b0a1-9be795634566","Type":"ContainerStarted","Data":"d1b509bcd50170d4da0aec2a74484b693202b20dd713005321b46e998cb91b7d"} Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.190818 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" podUID="54706fa6-a00c-4f66-bce5-660d5475ff7b" Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.194504 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" podUID="91bdc603-1933-4f22-9ecd-d3e95f2f0924" Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.195789 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" event={"ID":"14a62a5b-5719-4192-b2d8-40dc2e000f41","Type":"ContainerStarted","Data":"b814f1c02531cd3aa34d8fec35eef72a8bbf51de0c57ab7eb541251b05f8f625"} Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.196415 4991 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" podUID="725ee172-29cd-40c6-b0a1-9be795634566" Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.200560 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" event={"ID":"d21383b5-1b67-4b8f-a176-05769a28528d","Type":"ContainerStarted","Data":"f494f0af17801f72df2a241d92ebe71dcacdac504a64f927e427fdd1bdbcd157"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.201849 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" event={"ID":"e5351a0d-ece3-441f-95f6-ccc4281e9462","Type":"ContainerStarted","Data":"26cc9a95e363cd40e221949db6ed88a1da9e214b3d4e12402da22ec8c4b418e7"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.202849 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" event={"ID":"4275f29c-130a-4e02-aeb5-6410643706b8","Type":"ContainerStarted","Data":"3427d80b65b3658a338c947d01d4da1ce732ba5e686c6e6bab57ca6865f4d35a"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.204153 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" event={"ID":"4fafae92-8cd1-450f-9e3b-7abd1020ab75","Type":"ContainerStarted","Data":"f87ed2c8ffc9251154244dc1b599f224a74d4d61eaccd4d4a3af686936e03b61"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.206564 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" event={"ID":"6b1fa1e2-0eb6-48bf-bebc-93db7d90a644","Type":"ContainerStarted","Data":"91dd47528cf6651b252a76abb72231ee1c469615fa01f9f97960ab3d1759706e"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.208196 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" event={"ID":"7869a8de-b32f-4600-af69-49a42ec4d096","Type":"ContainerStarted","Data":"2e4555be72a45430e42dbd290cb9cc8ca214d06a638641ee38d91f09c691dfc1"} Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.209402 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" podUID="6b1fa1e2-0eb6-48bf-bebc-93db7d90a644" Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.210284 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" event={"ID":"66cf73ac-d6ca-4a8c-855b-3d26bec5adb7","Type":"ContainerStarted","Data":"5e800aafe54162c16c33a4aca235f027ea922bbf227e67d9b140638575ec0127"} Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.464913 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:37 crc kubenswrapper[4991]: I1206 01:19:37.464995 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.465206 4991 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.465362 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:39.465315509 +0000 UTC m=+1052.815606167 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "metrics-server-cert" not found Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.465232 4991 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 01:19:37 crc kubenswrapper[4991]: E1206 01:19:37.465460 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:39.465434042 +0000 UTC m=+1052.815724700 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "webhook-server-cert" not found Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 01:19:38.230909 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" podUID="54706fa6-a00c-4f66-bce5-660d5475ff7b" Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 01:19:38.232723 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" podUID="6b1fa1e2-0eb6-48bf-bebc-93db7d90a644" Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 01:19:38.233059 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" podUID="35ecf5a7-ec56-4cfc-a96b-84dada809373" Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 01:19:38.233054 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" podUID="725ee172-29cd-40c6-b0a1-9be795634566" Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 01:19:38.264373 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" podUID="91bdc603-1933-4f22-9ecd-d3e95f2f0924" Dec 06 01:19:38 crc kubenswrapper[4991]: I1206 01:19:38.586159 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 
01:19:38.586498 4991 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:38 crc kubenswrapper[4991]: E1206 01:19:38.586776 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert podName:37dd0775-3a19-4005-9cf0-7c038c293004 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:42.586751914 +0000 UTC m=+1055.937042372 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert") pod "infra-operator-controller-manager-57548d458d-96kw2" (UID: "37dd0775-3a19-4005-9cf0-7c038c293004") : secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:39 crc kubenswrapper[4991]: I1206 01:19:39.093174 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:39 crc kubenswrapper[4991]: E1206 01:19:39.093534 4991 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:39 crc kubenswrapper[4991]: E1206 01:19:39.093668 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert podName:1d549c0d-f4fd-4eeb-852b-78e302f65110 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:43.093633291 +0000 UTC m=+1056.443923759 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" (UID: "1d549c0d-f4fd-4eeb-852b-78e302f65110") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:39 crc kubenswrapper[4991]: I1206 01:19:39.503380 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:39 crc kubenswrapper[4991]: I1206 01:19:39.503442 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:39 crc kubenswrapper[4991]: E1206 01:19:39.503669 4991 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 01:19:39 crc kubenswrapper[4991]: E1206 01:19:39.503730 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:43.503711843 +0000 UTC m=+1056.854002311 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "webhook-server-cert" not found Dec 06 01:19:39 crc kubenswrapper[4991]: E1206 01:19:39.503884 4991 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 01:19:39 crc kubenswrapper[4991]: E1206 01:19:39.504077 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:43.50397551 +0000 UTC m=+1056.854265998 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "metrics-server-cert" not found Dec 06 01:19:42 crc kubenswrapper[4991]: I1206 01:19:42.685781 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:42 crc kubenswrapper[4991]: E1206 01:19:42.686392 4991 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:42 crc kubenswrapper[4991]: E1206 01:19:42.686494 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert 
podName:37dd0775-3a19-4005-9cf0-7c038c293004 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:50.686471414 +0000 UTC m=+1064.036761962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert") pod "infra-operator-controller-manager-57548d458d-96kw2" (UID: "37dd0775-3a19-4005-9cf0-7c038c293004") : secret "infra-operator-webhook-server-cert" not found Dec 06 01:19:43 crc kubenswrapper[4991]: I1206 01:19:43.193838 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:43 crc kubenswrapper[4991]: E1206 01:19:43.194151 4991 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:43 crc kubenswrapper[4991]: E1206 01:19:43.194284 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert podName:1d549c0d-f4fd-4eeb-852b-78e302f65110 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:51.194255566 +0000 UTC m=+1064.544546034 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" (UID: "1d549c0d-f4fd-4eeb-852b-78e302f65110") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 06 01:19:43 crc kubenswrapper[4991]: I1206 01:19:43.600612 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:43 crc kubenswrapper[4991]: I1206 01:19:43.600957 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:43 crc kubenswrapper[4991]: E1206 01:19:43.601083 4991 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 06 01:19:43 crc kubenswrapper[4991]: E1206 01:19:43.601137 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:51.601121404 +0000 UTC m=+1064.951411862 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "webhook-server-cert" not found Dec 06 01:19:43 crc kubenswrapper[4991]: E1206 01:19:43.601474 4991 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 06 01:19:43 crc kubenswrapper[4991]: E1206 01:19:43.601505 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs podName:a60f2feb-b041-4bb0-b718-9e669f3cf9c8 nodeName:}" failed. No retries permitted until 2025-12-06 01:19:51.601496824 +0000 UTC m=+1064.951787292 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs") pod "openstack-operator-controller-manager-7d5d6748cd-5gjhx" (UID: "a60f2feb-b041-4bb0-b718-9e669f3cf9c8") : secret "metrics-server-cert" not found Dec 06 01:19:46 crc kubenswrapper[4991]: I1206 01:19:46.739726 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:19:46 crc kubenswrapper[4991]: I1206 01:19:46.740138 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:19:46 crc kubenswrapper[4991]: I1206 01:19:46.740184 4991 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:19:46 crc kubenswrapper[4991]: I1206 01:19:46.740915 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f375611be938212e8f0efe7042b61ab4ee937e28fa539d018afc5fdc1b82d418"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:19:46 crc kubenswrapper[4991]: I1206 01:19:46.740966 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://f375611be938212e8f0efe7042b61ab4ee937e28fa539d018afc5fdc1b82d418" gracePeriod=600 Dec 06 01:19:47 crc kubenswrapper[4991]: I1206 01:19:47.326225 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="f375611be938212e8f0efe7042b61ab4ee937e28fa539d018afc5fdc1b82d418" exitCode=0 Dec 06 01:19:47 crc kubenswrapper[4991]: I1206 01:19:47.326294 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"f375611be938212e8f0efe7042b61ab4ee937e28fa539d018afc5fdc1b82d418"} Dec 06 01:19:47 crc kubenswrapper[4991]: I1206 01:19:47.326342 4991 scope.go:117] "RemoveContainer" containerID="42613daf70f47f0e109690b595c837fa9976654bde53b0036709cbe0f06846cc" Dec 06 01:19:49 crc kubenswrapper[4991]: E1206 01:19:49.414788 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 06 01:19:49 crc kubenswrapper[4991]: E1206 01:19:49.414977 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s27b8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-ftl59_openstack-operators(852a1018-8ddc-4a27-ba0c-8a1fcf0d010f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:19:49 crc kubenswrapper[4991]: E1206 01:19:49.993113 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 06 01:19:49 crc kubenswrapper[4991]: E1206 01:19:49.993318 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pwl4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-h9lxn_openstack-operators(fd88f0ae-0cbd-4958-a05a-51e18d3ff32f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:19:50 crc kubenswrapper[4991]: E1206 01:19:50.627758 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 06 01:19:50 crc kubenswrapper[4991]: E1206 01:19:50.628331 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcsvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-trx9l_openstack-operators(7f3695d3-1d23-4779-90a6-d1b62ac3edc5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:19:50 crc kubenswrapper[4991]: I1206 01:19:50.732413 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:50 crc kubenswrapper[4991]: I1206 01:19:50.762685 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37dd0775-3a19-4005-9cf0-7c038c293004-cert\") pod \"infra-operator-controller-manager-57548d458d-96kw2\" (UID: \"37dd0775-3a19-4005-9cf0-7c038c293004\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:50 crc kubenswrapper[4991]: I1206 01:19:50.775113 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.239227 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.243113 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d549c0d-f4fd-4eeb-852b-78e302f65110-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr\" (UID: \"1d549c0d-f4fd-4eeb-852b-78e302f65110\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:51 crc kubenswrapper[4991]: E1206 01:19:51.254907 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 06 01:19:51 crc kubenswrapper[4991]: E1206 01:19:51.255144 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zhgj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-z5dnf_openstack-operators(5cfe648a-c326-46aa-83ed-31659feb31d0): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:19:51 crc kubenswrapper[4991]: E1206 01:19:51.256796 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" podUID="5cfe648a-c326-46aa-83ed-31659feb31d0" Dec 06 01:19:51 crc kubenswrapper[4991]: E1206 01:19:51.370804 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" podUID="5cfe648a-c326-46aa-83ed-31659feb31d0" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.523327 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.644932 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.645012 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.648872 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-webhook-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.649177 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a60f2feb-b041-4bb0-b718-9e669f3cf9c8-metrics-certs\") pod \"openstack-operator-controller-manager-7d5d6748cd-5gjhx\" (UID: \"a60f2feb-b041-4bb0-b718-9e669f3cf9c8\") " pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:51 crc kubenswrapper[4991]: I1206 01:19:51.924351 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:52 crc kubenswrapper[4991]: E1206 01:19:52.080678 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 06 01:19:52 crc kubenswrapper[4991]: E1206 01:19:52.080973 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8vnmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-lwqvk_openstack-operators(d21383b5-1b67-4b8f-a176-05769a28528d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:19:52 crc kubenswrapper[4991]: E1206 01:19:52.609537 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 06 01:19:52 crc kubenswrapper[4991]: E1206 01:19:52.609750 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mlzlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-xjgls_openstack-operators(8c5d96f9-58a1-4ee7-82dd-f71237ef875c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:19:54 crc kubenswrapper[4991]: I1206 01:19:54.976363 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr"] Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.076264 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx"] Dec 06 01:19:55 crc kubenswrapper[4991]: W1206 01:19:55.174373 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d549c0d_f4fd_4eeb_852b_78e302f65110.slice/crio-f39548c29519db26e3486d8957facc53cf50c6f7147e7b3492a38841f613dd5e WatchSource:0}: Error finding container f39548c29519db26e3486d8957facc53cf50c6f7147e7b3492a38841f613dd5e: Status 404 returned error can't find the container with id f39548c29519db26e3486d8957facc53cf50c6f7147e7b3492a38841f613dd5e Dec 06 01:19:55 crc 
kubenswrapper[4991]: W1206 01:19:55.182799 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60f2feb_b041_4bb0_b718_9e669f3cf9c8.slice/crio-3b7c75bab8c4015e06eb2aa9165e604dd9b5f21cba88c0d75989d740d69e1431 WatchSource:0}: Error finding container 3b7c75bab8c4015e06eb2aa9165e604dd9b5f21cba88c0d75989d740d69e1431: Status 404 returned error can't find the container with id 3b7c75bab8c4015e06eb2aa9165e604dd9b5f21cba88c0d75989d740d69e1431 Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.223820 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-96kw2"] Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.415613 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" event={"ID":"a60f2feb-b041-4bb0-b718-9e669f3cf9c8","Type":"ContainerStarted","Data":"3b7c75bab8c4015e06eb2aa9165e604dd9b5f21cba88c0d75989d740d69e1431"} Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.419583 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" event={"ID":"c99be400-34b2-431b-ba71-9c6c03f70c91","Type":"ContainerStarted","Data":"c7a4b9f30b3cc6023f4634aabc6da74f2fee963b631dae0dbec13c7107c5d66f"} Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.429819 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"dba3ac16754d5a6b2843c8e44c7c8c77526e3ed39ca408957653647034fac3f9"} Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.435651 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" 
event={"ID":"7869a8de-b32f-4600-af69-49a42ec4d096","Type":"ContainerStarted","Data":"3fba822fe95f92f766d5db948b8deec1312d2fc84b2757f1098dc0d057bee95c"} Dec 06 01:19:55 crc kubenswrapper[4991]: I1206 01:19:55.437753 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" event={"ID":"1d549c0d-f4fd-4eeb-852b-78e302f65110","Type":"ContainerStarted","Data":"f39548c29519db26e3486d8957facc53cf50c6f7147e7b3492a38841f613dd5e"} Dec 06 01:19:55 crc kubenswrapper[4991]: W1206 01:19:55.761290 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37dd0775_3a19_4005_9cf0_7c038c293004.slice/crio-448edfe000d67b0e7298fdcb345fe9da93c859fb86eedb44d9d4a464dac64cf4 WatchSource:0}: Error finding container 448edfe000d67b0e7298fdcb345fe9da93c859fb86eedb44d9d4a464dac64cf4: Status 404 returned error can't find the container with id 448edfe000d67b0e7298fdcb345fe9da93c859fb86eedb44d9d4a464dac64cf4 Dec 06 01:19:56 crc kubenswrapper[4991]: I1206 01:19:56.444653 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" event={"ID":"4fafae92-8cd1-450f-9e3b-7abd1020ab75","Type":"ContainerStarted","Data":"005abe9631a21151a669dd093e38fac56eba6047636b0f423b736c1460c00104"} Dec 06 01:19:56 crc kubenswrapper[4991]: I1206 01:19:56.449544 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" event={"ID":"14a62a5b-5719-4192-b2d8-40dc2e000f41","Type":"ContainerStarted","Data":"13114d60c3c5f4c53ff93a1a42de403a65630f49b2cecaada422bd07095eb7f3"} Dec 06 01:19:56 crc kubenswrapper[4991]: I1206 01:19:56.453119 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" 
event={"ID":"a4fd6e71-89f4-41d3-beed-a894e45b8fe8","Type":"ContainerStarted","Data":"2dd26d3f35aa44556eb78481fb370b909257d0113d434822b8b25e224adbbe88"} Dec 06 01:19:56 crc kubenswrapper[4991]: I1206 01:19:56.454847 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" event={"ID":"37dd0775-3a19-4005-9cf0-7c038c293004","Type":"ContainerStarted","Data":"448edfe000d67b0e7298fdcb345fe9da93c859fb86eedb44d9d4a464dac64cf4"} Dec 06 01:19:56 crc kubenswrapper[4991]: I1206 01:19:56.456653 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" event={"ID":"e5351a0d-ece3-441f-95f6-ccc4281e9462","Type":"ContainerStarted","Data":"4d0cc9f8d0dfdb60f1463bdc6f51b060df821ac822456586f99c2183c26838b3"} Dec 06 01:19:56 crc kubenswrapper[4991]: I1206 01:19:56.459181 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" event={"ID":"4275f29c-130a-4e02-aeb5-6410643706b8","Type":"ContainerStarted","Data":"6d4991a95fa346c154b6650dd5cf7d74a465f8cd6beda30cc649d321fae1c0c9"} Dec 06 01:19:57 crc kubenswrapper[4991]: I1206 01:19:57.466001 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" event={"ID":"66cf73ac-d6ca-4a8c-855b-3d26bec5adb7","Type":"ContainerStarted","Data":"3b8e95f6280c5381e2ff2ced6d00fdb83a5cd38bea4a0561649f989239fead7c"} Dec 06 01:19:57 crc kubenswrapper[4991]: I1206 01:19:57.469405 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" event={"ID":"7ceea028-825c-4235-b1bb-52adec27f46d","Type":"ContainerStarted","Data":"6fe6ad05924154a5298c4dbd84373891e5a4448d1b51313244168955ddaf7d4d"} Dec 06 01:19:57 crc kubenswrapper[4991]: I1206 01:19:57.470563 4991 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" event={"ID":"6b1fa1e2-0eb6-48bf-bebc-93db7d90a644","Type":"ContainerStarted","Data":"a49424935496ce93cc18fbd288f57fb6a098fceb98570495dba61fb1e59e1abc"} Dec 06 01:19:58 crc kubenswrapper[4991]: I1206 01:19:58.479321 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" event={"ID":"35ecf5a7-ec56-4cfc-a96b-84dada809373","Type":"ContainerStarted","Data":"09dd5295a70062ace2472b802947ef2a58282207957a754dea66343142127bc1"} Dec 06 01:19:58 crc kubenswrapper[4991]: I1206 01:19:58.496823 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" event={"ID":"725ee172-29cd-40c6-b0a1-9be795634566","Type":"ContainerStarted","Data":"8952a119512994c4933ca372db776d1712f043f3abae61c1c1232166d435607a"} Dec 06 01:19:58 crc kubenswrapper[4991]: I1206 01:19:58.498142 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" event={"ID":"a60f2feb-b041-4bb0-b718-9e669f3cf9c8","Type":"ContainerStarted","Data":"a96d492b725ba4f94de51739e3036ebc18d9180745c0f902e73eacaacc223c04"} Dec 06 01:19:58 crc kubenswrapper[4991]: I1206 01:19:58.499487 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:19:58 crc kubenswrapper[4991]: I1206 01:19:58.530338 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" podStartSLOduration=23.530318759 podStartE2EDuration="23.530318759s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 
01:19:58.524840606 +0000 UTC m=+1071.875131064" watchObservedRunningTime="2025-12-06 01:19:58.530318759 +0000 UTC m=+1071.880609227" Dec 06 01:20:02 crc kubenswrapper[4991]: I1206 01:20:02.544512 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" event={"ID":"54706fa6-a00c-4f66-bce5-660d5475ff7b","Type":"ContainerStarted","Data":"6afde5d500c5ca7a5c077769ccdf2cde38ab910c1549e817dc5a939027c35e95"} Dec 06 01:20:02 crc kubenswrapper[4991]: I1206 01:20:02.547226 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" event={"ID":"91bdc603-1933-4f22-9ecd-d3e95f2f0924","Type":"ContainerStarted","Data":"c051a9e0644540e60487ddb5908b03c43687e66fc1048c037bd08e8a1a054f3b"} Dec 06 01:20:02 crc kubenswrapper[4991]: E1206 01:20:02.963018 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" podUID="fd88f0ae-0cbd-4958-a05a-51e18d3ff32f" Dec 06 01:20:02 crc kubenswrapper[4991]: E1206 01:20:02.986425 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" podUID="d21383b5-1b67-4b8f-a176-05769a28528d" Dec 06 01:20:03 crc kubenswrapper[4991]: E1206 01:20:03.037854 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" podUID="852a1018-8ddc-4a27-ba0c-8a1fcf0d010f" Dec 06 
01:20:03 crc kubenswrapper[4991]: E1206 01:20:03.371106 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" podUID="8c5d96f9-58a1-4ee7-82dd-f71237ef875c" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.557375 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" event={"ID":"8c5d96f9-58a1-4ee7-82dd-f71237ef875c","Type":"ContainerStarted","Data":"3873a38a0568f1f7641af7fae0ffbb4b8c8199d6b4229566f58d738e89ccaf37"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.561561 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" event={"ID":"fd88f0ae-0cbd-4958-a05a-51e18d3ff32f","Type":"ContainerStarted","Data":"fa66bea204c557eeffc890ecfbfef08c81c7b2b6eb30dc5bdba115012e7e87d7"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.567435 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" event={"ID":"4275f29c-130a-4e02-aeb5-6410643706b8","Type":"ContainerStarted","Data":"fd7cada018a4d181f8c7d12c3c9204dc1918d4f110dd33d68c5d522709994d41"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.569618 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.572118 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.575296 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" event={"ID":"7ceea028-825c-4235-b1bb-52adec27f46d","Type":"ContainerStarted","Data":"6df94bb1ee07c110338ba83f0f29afe8c6c28500b2573ece01206973138f5c3e"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.577698 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.578772 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.587811 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" event={"ID":"d21383b5-1b67-4b8f-a176-05769a28528d","Type":"ContainerStarted","Data":"119b5b19b4b8bc7c84d8505bd4bfa3921ce57b124a069980703c98783913d707"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.608796 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" event={"ID":"37dd0775-3a19-4005-9cf0-7c038c293004","Type":"ContainerStarted","Data":"a9b8b0d2b3513c40fd652e9fe606d7fd587889f0eb78a9a052b0deb676afb998"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.639699 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" event={"ID":"1d549c0d-f4fd-4eeb-852b-78e302f65110","Type":"ContainerStarted","Data":"b73ee0ba36421b8e513406ce9dc4abb16a890a7bf299cec637683f18a92de6c7"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.641658 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-twvbc" podStartSLOduration=3.026114173 
podStartE2EDuration="29.641642897s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:35.960229563 +0000 UTC m=+1049.310520031" lastFinishedPulling="2025-12-06 01:20:02.575758287 +0000 UTC m=+1075.926048755" observedRunningTime="2025-12-06 01:20:03.638554506 +0000 UTC m=+1076.988844974" watchObservedRunningTime="2025-12-06 01:20:03.641642897 +0000 UTC m=+1076.991933365" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.651626 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" event={"ID":"852a1018-8ddc-4a27-ba0c-8a1fcf0d010f","Type":"ContainerStarted","Data":"eae50f5f3c334c170893f3b0bb3135ff44b1929c9c8ff9649276e9dcfa3dda2e"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.670270 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" event={"ID":"a4fd6e71-89f4-41d3-beed-a894e45b8fe8","Type":"ContainerStarted","Data":"7fcff4cab44a5f5c39a140d9c7f0d9190c0ddf387857de63d1fb6d06483740af"} Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.672122 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hll9p" podStartSLOduration=3.68760081 podStartE2EDuration="29.672098925s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.475572552 +0000 UTC m=+1049.825863020" lastFinishedPulling="2025-12-06 01:20:02.460070627 +0000 UTC m=+1075.810361135" observedRunningTime="2025-12-06 01:20:03.670878003 +0000 UTC m=+1077.021168471" watchObservedRunningTime="2025-12-06 01:20:03.672098925 +0000 UTC m=+1077.022389393" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.673945 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:20:03 crc 
kubenswrapper[4991]: I1206 01:20:03.681700 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" Dec 06 01:20:03 crc kubenswrapper[4991]: I1206 01:20:03.765912 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-9phkz" podStartSLOduration=3.029772454 podStartE2EDuration="28.765892302s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.713411783 +0000 UTC m=+1050.063702251" lastFinishedPulling="2025-12-06 01:20:02.449531631 +0000 UTC m=+1075.799822099" observedRunningTime="2025-12-06 01:20:03.739644434 +0000 UTC m=+1077.089934902" watchObservedRunningTime="2025-12-06 01:20:03.765892302 +0000 UTC m=+1077.116182770" Dec 06 01:20:04 crc kubenswrapper[4991]: E1206 01:20:04.060054 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" podUID="7f3695d3-1d23-4779-90a6-d1b62ac3edc5" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.681071 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" event={"ID":"e5351a0d-ece3-441f-95f6-ccc4281e9462","Type":"ContainerStarted","Data":"cfb50ef75c20affe5d107b88f5c3494d253ec9b2f79eb1b74a361e9f186a8087"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.682344 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.684081 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.688580 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" event={"ID":"35ecf5a7-ec56-4cfc-a96b-84dada809373","Type":"ContainerStarted","Data":"92f195f8372065acad5706ad932de01e26cdd237ec753ba59cfd4ca0ff425692"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.689274 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.695023 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" event={"ID":"725ee172-29cd-40c6-b0a1-9be795634566","Type":"ContainerStarted","Data":"1a124edfdb7049f1a9f015aac705366ec39334c641659c11f6f41298123ceb9a"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.697603 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.697712 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.701935 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" event={"ID":"c99be400-34b2-431b-ba71-9c6c03f70c91","Type":"ContainerStarted","Data":"be7e66b9051a65b132ed3e57b97964965e019f444d8eb7d9b0f719e9ccfe106e"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.703267 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:20:04 crc 
kubenswrapper[4991]: I1206 01:20:04.704808 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.704927 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" event={"ID":"7f3695d3-1d23-4779-90a6-d1b62ac3edc5","Type":"ContainerStarted","Data":"6d542e0aa7a31b4ceade77e7954b5f4e0d5f66e4025b106efbfe5efbd2633094"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.705735 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.709350 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" event={"ID":"1d549c0d-f4fd-4eeb-852b-78e302f65110","Type":"ContainerStarted","Data":"70ba7b87549f493f22584eff0ef3116377352a93ca42c8494d462ece7fe262e2"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.710202 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.711785 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" event={"ID":"37dd0775-3a19-4005-9cf0-7c038c293004","Type":"ContainerStarted","Data":"d15c78a97d41808d90a9670b48c28ada7a5c4b7af43787cd94cc6234e25ccc0d"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.712485 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.716358 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" event={"ID":"66cf73ac-d6ca-4a8c-855b-3d26bec5adb7","Type":"ContainerStarted","Data":"a560b401238d220e85f1d8f1ed0f8e75f57d8827aa1977f5224c167c3af778d9"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.718072 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.723717 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.726191 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-rlhms" podStartSLOduration=4.9029522960000005 podStartE2EDuration="30.726167176s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.549418647 +0000 UTC m=+1049.899709115" lastFinishedPulling="2025-12-06 01:20:02.372633517 +0000 UTC m=+1075.722923995" observedRunningTime="2025-12-06 01:20:04.72400917 +0000 UTC m=+1078.074299648" watchObservedRunningTime="2025-12-06 01:20:04.726167176 +0000 UTC m=+1078.076457644" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.727792 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" event={"ID":"4fafae92-8cd1-450f-9e3b-7abd1020ab75","Type":"ContainerStarted","Data":"f625fb0fe4a5f0db1033561ea149c4a490f90516598353b8ca47ce6af68b7bd6"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.728443 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.730105 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" event={"ID":"54706fa6-a00c-4f66-bce5-660d5475ff7b","Type":"ContainerStarted","Data":"5dd1856f6d7cb9e8e3cc776441d3183219acf65cd159e3b230ea3803a6271508"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.730194 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.731251 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.733027 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" event={"ID":"91bdc603-1933-4f22-9ecd-d3e95f2f0924","Type":"ContainerStarted","Data":"0a8ca2cb8f54906e87fda9488d5aa4db20f71c5662ef216d922f9663f1406ec8"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.734417 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.735833 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" event={"ID":"7869a8de-b32f-4600-af69-49a42ec4d096","Type":"ContainerStarted","Data":"bdc85bcc43de5585d570d526ebeeb2744486e79b10b067842b6b9f1d9ae06447"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.737739 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.739267 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.739950 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" event={"ID":"6b1fa1e2-0eb6-48bf-bebc-93db7d90a644","Type":"ContainerStarted","Data":"956b01a33c3d0aaa7920616b83e359bee5b1907bb886e83bb581a8b1d3ba72b9"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.740791 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.743747 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.747648 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" event={"ID":"14a62a5b-5719-4192-b2d8-40dc2e000f41","Type":"ContainerStarted","Data":"3cc5346972d990b8b008c390e0401a58393af7023dd2937f8d2f1ed5da109791"} Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.747718 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.748286 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.768649 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-crkd2" podStartSLOduration=4.739904686 podStartE2EDuration="30.768624309s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.471374082 
+0000 UTC m=+1049.821664560" lastFinishedPulling="2025-12-06 01:20:02.500093705 +0000 UTC m=+1075.850384183" observedRunningTime="2025-12-06 01:20:04.757966559 +0000 UTC m=+1078.108257027" watchObservedRunningTime="2025-12-06 01:20:04.768624309 +0000 UTC m=+1078.118914777" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.847372 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" podStartSLOduration=24.350422577 podStartE2EDuration="30.847345101s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:55.769815049 +0000 UTC m=+1069.120105517" lastFinishedPulling="2025-12-06 01:20:02.266737553 +0000 UTC m=+1075.617028041" observedRunningTime="2025-12-06 01:20:04.846661733 +0000 UTC m=+1078.196952201" watchObservedRunningTime="2025-12-06 01:20:04.847345101 +0000 UTC m=+1078.197635569" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.853805 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" podStartSLOduration=23.749208697 podStartE2EDuration="30.853786529s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:55.177923674 +0000 UTC m=+1068.528214142" lastFinishedPulling="2025-12-06 01:20:02.282501506 +0000 UTC m=+1075.632791974" observedRunningTime="2025-12-06 01:20:04.818786943 +0000 UTC m=+1078.169077431" watchObservedRunningTime="2025-12-06 01:20:04.853786529 +0000 UTC m=+1078.204077007" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.868968 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r6xm6" podStartSLOduration=4.279429227 podStartE2EDuration="29.868945746s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.870728273 +0000 
UTC m=+1050.221018741" lastFinishedPulling="2025-12-06 01:20:02.460244782 +0000 UTC m=+1075.810535260" observedRunningTime="2025-12-06 01:20:04.861021209 +0000 UTC m=+1078.211311677" watchObservedRunningTime="2025-12-06 01:20:04.868945746 +0000 UTC m=+1078.219236214" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.896065 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-gbqt8" podStartSLOduration=4.266393251 podStartE2EDuration="30.896045926s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:35.874913558 +0000 UTC m=+1049.225204026" lastFinishedPulling="2025-12-06 01:20:02.504566223 +0000 UTC m=+1075.854856701" observedRunningTime="2025-12-06 01:20:04.87785285 +0000 UTC m=+1078.228143318" watchObservedRunningTime="2025-12-06 01:20:04.896045926 +0000 UTC m=+1078.246336394" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.935557 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5j6br" podStartSLOduration=4.90687766 podStartE2EDuration="30.935537891s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.475771418 +0000 UTC m=+1049.826061886" lastFinishedPulling="2025-12-06 01:20:02.504431609 +0000 UTC m=+1075.854722117" observedRunningTime="2025-12-06 01:20:04.929447151 +0000 UTC m=+1078.279737619" watchObservedRunningTime="2025-12-06 01:20:04.935537891 +0000 UTC m=+1078.285828359" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.936435 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-72txt" podStartSLOduration=4.102737929 podStartE2EDuration="29.936428464s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.714398988 +0000 UTC 
m=+1050.064689456" lastFinishedPulling="2025-12-06 01:20:02.548089513 +0000 UTC m=+1075.898379991" observedRunningTime="2025-12-06 01:20:04.912404055 +0000 UTC m=+1078.262694513" watchObservedRunningTime="2025-12-06 01:20:04.936428464 +0000 UTC m=+1078.286718932" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.981759 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-4pvnj" podStartSLOduration=4.996037745 podStartE2EDuration="30.981730471s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.471714721 +0000 UTC m=+1049.822005189" lastFinishedPulling="2025-12-06 01:20:02.457407407 +0000 UTC m=+1075.807697915" observedRunningTime="2025-12-06 01:20:04.957444435 +0000 UTC m=+1078.307734903" watchObservedRunningTime="2025-12-06 01:20:04.981730471 +0000 UTC m=+1078.332020939" Dec 06 01:20:04 crc kubenswrapper[4991]: I1206 01:20:04.982566 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" podStartSLOduration=9.423999139 podStartE2EDuration="29.982560043s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.715290172 +0000 UTC m=+1050.065580640" lastFinishedPulling="2025-12-06 01:19:57.273851076 +0000 UTC m=+1070.624141544" observedRunningTime="2025-12-06 01:20:04.972720955 +0000 UTC m=+1078.323011423" watchObservedRunningTime="2025-12-06 01:20:04.982560043 +0000 UTC m=+1078.332850511" Dec 06 01:20:05 crc kubenswrapper[4991]: I1206 01:20:05.003188 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-5v8fh" podStartSLOduration=4.204390522 podStartE2EDuration="30.003159512s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.694797815 +0000 UTC 
m=+1050.045088283" lastFinishedPulling="2025-12-06 01:20:02.493566795 +0000 UTC m=+1075.843857273" observedRunningTime="2025-12-06 01:20:04.997206116 +0000 UTC m=+1078.347496584" watchObservedRunningTime="2025-12-06 01:20:05.003159512 +0000 UTC m=+1078.353449980" Dec 06 01:20:05 crc kubenswrapper[4991]: I1206 01:20:05.027946 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" podStartSLOduration=12.572285450999999 podStartE2EDuration="31.0279096s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.72058303 +0000 UTC m=+1050.070873498" lastFinishedPulling="2025-12-06 01:19:55.176207179 +0000 UTC m=+1068.526497647" observedRunningTime="2025-12-06 01:20:05.019970233 +0000 UTC m=+1078.370260701" watchObservedRunningTime="2025-12-06 01:20:05.0279096 +0000 UTC m=+1078.378200068" Dec 06 01:20:05 crc kubenswrapper[4991]: I1206 01:20:05.046707 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-gkl7x" podStartSLOduration=4.184359929 podStartE2EDuration="30.046689673s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.713577167 +0000 UTC m=+1050.063867635" lastFinishedPulling="2025-12-06 01:20:02.575906911 +0000 UTC m=+1075.926197379" observedRunningTime="2025-12-06 01:20:05.038343924 +0000 UTC m=+1078.388634382" watchObservedRunningTime="2025-12-06 01:20:05.046689673 +0000 UTC m=+1078.396980141" Dec 06 01:20:10 crc kubenswrapper[4991]: I1206 01:20:10.784055 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-96kw2" Dec 06 01:20:11 crc kubenswrapper[4991]: I1206 01:20:11.532491 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr" Dec 06 01:20:11 crc kubenswrapper[4991]: I1206 01:20:11.820925 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" event={"ID":"d21383b5-1b67-4b8f-a176-05769a28528d","Type":"ContainerStarted","Data":"033f5c0388bf6f2ba7812c0a2d87dbef1ee33b3778c2569b361ce66f5b85e741"} Dec 06 01:20:11 crc kubenswrapper[4991]: I1206 01:20:11.935228 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d5d6748cd-5gjhx" Dec 06 01:20:12 crc kubenswrapper[4991]: I1206 01:20:12.831174 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" event={"ID":"8c5d96f9-58a1-4ee7-82dd-f71237ef875c","Type":"ContainerStarted","Data":"ac4612268af13087ecf717420c0e818ed459eebdfbcde882cb35fbb05f75d99c"} Dec 06 01:20:12 crc kubenswrapper[4991]: I1206 01:20:12.831645 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:20:12 crc kubenswrapper[4991]: I1206 01:20:12.831756 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:20:12 crc kubenswrapper[4991]: I1206 01:20:12.858484 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" podStartSLOduration=11.351942875 podStartE2EDuration="38.858458558s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.556504212 +0000 UTC m=+1049.906794680" lastFinishedPulling="2025-12-06 01:20:04.063019895 +0000 UTC m=+1077.413310363" observedRunningTime="2025-12-06 01:20:12.850209962 +0000 UTC m=+1086.200500420" 
watchObservedRunningTime="2025-12-06 01:20:12.858458558 +0000 UTC m=+1086.208749026" Dec 06 01:20:12 crc kubenswrapper[4991]: I1206 01:20:12.878790 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" podStartSLOduration=11.012242427 podStartE2EDuration="38.87876446s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.194768047 +0000 UTC m=+1049.545058515" lastFinishedPulling="2025-12-06 01:20:04.06129008 +0000 UTC m=+1077.411580548" observedRunningTime="2025-12-06 01:20:12.87606757 +0000 UTC m=+1086.226358068" watchObservedRunningTime="2025-12-06 01:20:12.87876446 +0000 UTC m=+1086.229054928" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.849901 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" event={"ID":"fd88f0ae-0cbd-4958-a05a-51e18d3ff32f","Type":"ContainerStarted","Data":"d4a648a6240d446a58e40bb42c95f2a061c3281b0f74994857776b6f8f3810d6"} Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.850427 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.854520 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" event={"ID":"5cfe648a-c326-46aa-83ed-31659feb31d0","Type":"ContainerStarted","Data":"e359e2419a451d594900d0f9bc4afbdc58dfe48befb065272268378450b1f217"} Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.858373 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" event={"ID":"7f3695d3-1d23-4779-90a6-d1b62ac3edc5","Type":"ContainerStarted","Data":"7b1be07ede0a1340f05cd17f1081709f0673b511ac15a13b5d274afad668e7c7"} 
Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.858715 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.860924 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" event={"ID":"852a1018-8ddc-4a27-ba0c-8a1fcf0d010f","Type":"ContainerStarted","Data":"a4577abb73b3de5a21e83148c353e385269a9269848ed511a3b891319eb92284"} Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.861313 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.873175 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" podStartSLOduration=2.836154228 podStartE2EDuration="40.873152012s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.491824498 +0000 UTC m=+1049.842114966" lastFinishedPulling="2025-12-06 01:20:14.528822272 +0000 UTC m=+1087.879112750" observedRunningTime="2025-12-06 01:20:14.867296428 +0000 UTC m=+1088.217586906" watchObservedRunningTime="2025-12-06 01:20:14.873152012 +0000 UTC m=+1088.223442490" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.892267 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" podStartSLOduration=2.923785363 podStartE2EDuration="40.892247512s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.556202494 +0000 UTC m=+1049.906492962" lastFinishedPulling="2025-12-06 01:20:14.524664603 +0000 UTC m=+1087.874955111" observedRunningTime="2025-12-06 01:20:14.888009401 +0000 UTC 
m=+1088.238299879" watchObservedRunningTime="2025-12-06 01:20:14.892247512 +0000 UTC m=+1088.242537990" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.914485 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" podStartSLOduration=2.44456121 podStartE2EDuration="40.914461394s" podCreationTimestamp="2025-12-06 01:19:34 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.055073697 +0000 UTC m=+1049.405364165" lastFinishedPulling="2025-12-06 01:20:14.524973881 +0000 UTC m=+1087.875264349" observedRunningTime="2025-12-06 01:20:14.909277008 +0000 UTC m=+1088.259567486" watchObservedRunningTime="2025-12-06 01:20:14.914461394 +0000 UTC m=+1088.264751872" Dec 06 01:20:14 crc kubenswrapper[4991]: I1206 01:20:14.936879 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5dnf" podStartSLOduration=2.291617249 podStartE2EDuration="39.936852631s" podCreationTimestamp="2025-12-06 01:19:35 +0000 UTC" firstStartedPulling="2025-12-06 01:19:36.878550238 +0000 UTC m=+1050.228840706" lastFinishedPulling="2025-12-06 01:20:14.52378562 +0000 UTC m=+1087.874076088" observedRunningTime="2025-12-06 01:20:14.930517835 +0000 UTC m=+1088.280808313" watchObservedRunningTime="2025-12-06 01:20:14.936852631 +0000 UTC m=+1088.287143099" Dec 06 01:20:15 crc kubenswrapper[4991]: I1206 01:20:15.589607 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-4hx8k" Dec 06 01:20:15 crc kubenswrapper[4991]: I1206 01:20:15.794506 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdckl" Dec 06 01:20:25 crc kubenswrapper[4991]: I1206 01:20:25.146160 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ftl59" Dec 06 01:20:25 crc kubenswrapper[4991]: I1206 01:20:25.206582 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-lwqvk" Dec 06 01:20:25 crc kubenswrapper[4991]: I1206 01:20:25.367950 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-trx9l" Dec 06 01:20:25 crc kubenswrapper[4991]: I1206 01:20:25.368642 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-h9lxn" Dec 06 01:20:25 crc kubenswrapper[4991]: I1206 01:20:25.543826 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xjgls" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.239263 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l2qb2"] Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.247383 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.250548 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.250574 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.250966 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.252152 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vjt6t" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.268274 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l2qb2"] Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.307449 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-config\") pod \"dnsmasq-dns-675f4bcbfc-l2qb2\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.307567 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr6mf\" (UniqueName: \"kubernetes.io/projected/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-kube-api-access-sr6mf\") pod \"dnsmasq-dns-675f4bcbfc-l2qb2\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.392080 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k6rsf"] Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.393542 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.395337 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.408742 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-config\") pod \"dnsmasq-dns-675f4bcbfc-l2qb2\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.408807 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmwt\" (UniqueName: \"kubernetes.io/projected/4d13b5e3-846c-44ea-ae2d-3e4e28261447-kube-api-access-jsmwt\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.408843 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr6mf\" (UniqueName: \"kubernetes.io/projected/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-kube-api-access-sr6mf\") pod \"dnsmasq-dns-675f4bcbfc-l2qb2\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.408883 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-config\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.408925 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.409774 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-config\") pod \"dnsmasq-dns-675f4bcbfc-l2qb2\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.417998 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k6rsf"] Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.441329 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr6mf\" (UniqueName: \"kubernetes.io/projected/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-kube-api-access-sr6mf\") pod \"dnsmasq-dns-675f4bcbfc-l2qb2\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.509810 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-config\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.509896 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.509948 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jsmwt\" (UniqueName: \"kubernetes.io/projected/4d13b5e3-846c-44ea-ae2d-3e4e28261447-kube-api-access-jsmwt\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.511565 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-config\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.512265 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.533447 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmwt\" (UniqueName: \"kubernetes.io/projected/4d13b5e3-846c-44ea-ae2d-3e4e28261447-kube-api-access-jsmwt\") pod \"dnsmasq-dns-78dd6ddcc-k6rsf\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.579909 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.719015 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.977063 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k6rsf"] Dec 06 01:20:44 crc kubenswrapper[4991]: I1206 01:20:44.981832 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:20:45 crc kubenswrapper[4991]: I1206 01:20:45.034669 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l2qb2"] Dec 06 01:20:45 crc kubenswrapper[4991]: I1206 01:20:45.161169 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" event={"ID":"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1","Type":"ContainerStarted","Data":"1abd97eefeb5e5fb4f43caed88683f30e450a8d7546d0c67ebce6da6a29b1a75"} Dec 06 01:20:45 crc kubenswrapper[4991]: I1206 01:20:45.162411 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" event={"ID":"4d13b5e3-846c-44ea-ae2d-3e4e28261447","Type":"ContainerStarted","Data":"3953203c4cb4439215b38c930c8be15e8ed5cf3a66f85ffe2d88917576eb947e"} Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.510429 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l2qb2"] Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.530066 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pqmt8"] Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.531415 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.542146 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4nd\" (UniqueName: \"kubernetes.io/projected/b6d417b0-d165-455f-9505-de10a93777c4-kube-api-access-hd4nd\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.542189 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.542219 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-config\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.544128 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pqmt8"] Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.644099 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-config\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.644272 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4nd\" (UniqueName: 
\"kubernetes.io/projected/b6d417b0-d165-455f-9505-de10a93777c4-kube-api-access-hd4nd\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.644304 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.645228 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-config\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.645326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.675675 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4nd\" (UniqueName: \"kubernetes.io/projected/b6d417b0-d165-455f-9505-de10a93777c4-kube-api-access-hd4nd\") pod \"dnsmasq-dns-5ccc8479f9-pqmt8\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.836107 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k6rsf"] Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.861720 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.864351 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tc69q"] Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.888855 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.894418 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tc69q"] Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.957190 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-config\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.957397 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:46 crc kubenswrapper[4991]: I1206 01:20:46.957657 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstdg\" (UniqueName: \"kubernetes.io/projected/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-kube-api-access-xstdg\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.065030 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xstdg\" (UniqueName: 
\"kubernetes.io/projected/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-kube-api-access-xstdg\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.065131 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-config\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.065169 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.066121 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.066391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-config\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.097927 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xstdg\" (UniqueName: \"kubernetes.io/projected/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-kube-api-access-xstdg\") pod \"dnsmasq-dns-57d769cc4f-tc69q\" 
(UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.248809 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.476805 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pqmt8"] Dec 06 01:20:47 crc kubenswrapper[4991]: W1206 01:20:47.488942 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d417b0_d165_455f_9505_de10a93777c4.slice/crio-77ec72aac4fe99a2904b2feb019ff65768da504ca666d9d7cc7ca02bde05152c WatchSource:0}: Error finding container 77ec72aac4fe99a2904b2feb019ff65768da504ca666d9d7cc7ca02bde05152c: Status 404 returned error can't find the container with id 77ec72aac4fe99a2904b2feb019ff65768da504ca666d9d7cc7ca02bde05152c Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.663934 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.666420 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.680676 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.680719 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dkcz5" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.681133 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.681265 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.681379 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.681503 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.690424 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.703143 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775622 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775706 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775726 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775763 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775791 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8380dcf2-30be-4578-937b-5e2a8f4fc074-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775817 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.775949 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5dn5h\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-kube-api-access-5dn5h\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.776064 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tc69q"] Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.776227 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.776279 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.776332 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.776355 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8380dcf2-30be-4578-937b-5e2a8f4fc074-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 
06 01:20:47 crc kubenswrapper[4991]: W1206 01:20:47.784788 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceab5909_eb2b_44e4_a0e2_3dbcfa7db592.slice/crio-a39f44d5ea884990ba22b36b87a150f29fd868c11d51325ea855eeaafb6597e6 WatchSource:0}: Error finding container a39f44d5ea884990ba22b36b87a150f29fd868c11d51325ea855eeaafb6597e6: Status 404 returned error can't find the container with id a39f44d5ea884990ba22b36b87a150f29fd868c11d51325ea855eeaafb6597e6 Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.878797 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.878954 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.879795 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8380dcf2-30be-4578-937b-5e2a8f4fc074-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.879857 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.879930 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.879958 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.880018 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.880041 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8380dcf2-30be-4578-937b-5e2a8f4fc074-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.880160 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 
01:20:47.880195 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dn5h\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-kube-api-access-5dn5h\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.880278 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.880674 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.881009 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.881187 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.881223 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.881589 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.882732 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.888671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8380dcf2-30be-4578-937b-5e2a8f4fc074-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.889909 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.890873 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8380dcf2-30be-4578-937b-5e2a8f4fc074-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.891115 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.909282 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dn5h\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-kube-api-access-5dn5h\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:47 crc kubenswrapper[4991]: I1206 01:20:47.916728 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.012355 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.018612 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.018850 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.021437 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.021938 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.023802 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.022170 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.022343 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-259kt" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.022706 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.022816 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.030835 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.185941 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-config-data\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186072 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186158 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186209 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186236 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186258 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186283 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-plugins\") 
pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186324 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cs7\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-kube-api-access-68cs7\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186403 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186440 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.186468 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.229859 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" event={"ID":"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592","Type":"ContainerStarted","Data":"a39f44d5ea884990ba22b36b87a150f29fd868c11d51325ea855eeaafb6597e6"} Dec 06 01:20:48 crc 
kubenswrapper[4991]: I1206 01:20:48.231447 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" event={"ID":"b6d417b0-d165-455f-9505-de10a93777c4","Type":"ContainerStarted","Data":"77ec72aac4fe99a2904b2feb019ff65768da504ca666d9d7cc7ca02bde05152c"} Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287440 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287493 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287519 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287545 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-config-data\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287566 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287604 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287631 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287648 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287667 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " 
pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.287704 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cs7\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-kube-api-access-68cs7\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.288265 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.295943 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.296282 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.298207 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.298271 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-config-data\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.300001 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.305911 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.306045 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.306944 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.307733 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.312843 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cs7\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-kube-api-access-68cs7\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.319346 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.346233 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 01:20:48 crc kubenswrapper[4991]: I1206 01:20:48.908928 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.202416 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.251289 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8380dcf2-30be-4578-937b-5e2a8f4fc074","Type":"ContainerStarted","Data":"bdae74660fde60c7bce1f8fd00b55f9fd77c7d915f167364a5bb557731ed285a"} Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.369155 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.370525 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.376608 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.377173 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.377347 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nrzn9" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.377489 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.378476 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.381483 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.570908 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-kolla-config\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571029 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b6ac217-8e85-4279-b2d1-7dd35227f828-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571065 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6ac217-8e85-4279-b2d1-7dd35227f828-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571090 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-config-data-default\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571113 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhcj\" (UniqueName: \"kubernetes.io/projected/1b6ac217-8e85-4279-b2d1-7dd35227f828-kube-api-access-hlhcj\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571147 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6ac217-8e85-4279-b2d1-7dd35227f828-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.571206 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674087 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674166 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-kolla-config\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674204 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b6ac217-8e85-4279-b2d1-7dd35227f828-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674231 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6ac217-8e85-4279-b2d1-7dd35227f828-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674253 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674270 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhcj\" (UniqueName: \"kubernetes.io/projected/1b6ac217-8e85-4279-b2d1-7dd35227f828-kube-api-access-hlhcj\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674296 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.674318 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6ac217-8e85-4279-b2d1-7dd35227f828-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.675968 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b6ac217-8e85-4279-b2d1-7dd35227f828-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.676435 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-kolla-config\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.676603 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-config-data-default\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.676822 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b6ac217-8e85-4279-b2d1-7dd35227f828-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.680712 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6ac217-8e85-4279-b2d1-7dd35227f828-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.681303 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6ac217-8e85-4279-b2d1-7dd35227f828-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.681715 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.702522 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhcj\" (UniqueName: 
\"kubernetes.io/projected/1b6ac217-8e85-4279-b2d1-7dd35227f828-kube-api-access-hlhcj\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:49 crc kubenswrapper[4991]: I1206 01:20:49.726020 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"1b6ac217-8e85-4279-b2d1-7dd35227f828\") " pod="openstack/openstack-galera-0" Dec 06 01:20:50 crc kubenswrapper[4991]: I1206 01:20:50.004967 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:50.965504 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:50.975433 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:50.975554 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.009539 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qprvh" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.009792 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.009929 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.010062 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.115919 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8sr7\" (UniqueName: \"kubernetes.io/projected/e937b594-10b6-4742-a47a-7d21f0c19636-kube-api-access-x8sr7\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.116032 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.116346 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 
01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.116455 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b594-10b6-4742-a47a-7d21f0c19636-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.116683 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.116824 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937b594-10b6-4742-a47a-7d21f0c19636-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.116884 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.117056 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e937b594-10b6-4742-a47a-7d21f0c19636-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 
01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.221729 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e937b594-10b6-4742-a47a-7d21f0c19636-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.221791 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8sr7\" (UniqueName: \"kubernetes.io/projected/e937b594-10b6-4742-a47a-7d21f0c19636-kube-api-access-x8sr7\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.221826 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.221897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.221927 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b594-10b6-4742-a47a-7d21f0c19636-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.222000 
4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.222040 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937b594-10b6-4742-a47a-7d21f0c19636-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.222069 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.222260 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e937b594-10b6-4742-a47a-7d21f0c19636-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.223364 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.223777 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.224261 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.224514 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e937b594-10b6-4742-a47a-7d21f0c19636-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.234178 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e937b594-10b6-4742-a47a-7d21f0c19636-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.240069 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8sr7\" (UniqueName: \"kubernetes.io/projected/e937b594-10b6-4742-a47a-7d21f0c19636-kube-api-access-x8sr7\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.240108 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b594-10b6-4742-a47a-7d21f0c19636-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.258935 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e937b594-10b6-4742-a47a-7d21f0c19636\") " pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.320861 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.321834 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.327546 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.327714 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.327868 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nlcmg" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.341456 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.419574 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.425244 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68a4be52-9eb9-4c1f-b625-de969a73a990-config-data\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.425304 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a4be52-9eb9-4c1f-b625-de969a73a990-combined-ca-bundle\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.425504 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68a4be52-9eb9-4c1f-b625-de969a73a990-kolla-config\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.425561 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a4be52-9eb9-4c1f-b625-de969a73a990-memcached-tls-certs\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.425826 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dcgd\" (UniqueName: 
\"kubernetes.io/projected/68a4be52-9eb9-4c1f-b625-de969a73a990-kube-api-access-8dcgd\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.527139 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68a4be52-9eb9-4c1f-b625-de969a73a990-config-data\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.527190 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a4be52-9eb9-4c1f-b625-de969a73a990-combined-ca-bundle\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.527231 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68a4be52-9eb9-4c1f-b625-de969a73a990-kolla-config\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.527249 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a4be52-9eb9-4c1f-b625-de969a73a990-memcached-tls-certs\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.527311 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dcgd\" (UniqueName: \"kubernetes.io/projected/68a4be52-9eb9-4c1f-b625-de969a73a990-kube-api-access-8dcgd\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc 
kubenswrapper[4991]: I1206 01:20:51.528387 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68a4be52-9eb9-4c1f-b625-de969a73a990-config-data\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.529243 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68a4be52-9eb9-4c1f-b625-de969a73a990-kolla-config\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.552732 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a4be52-9eb9-4c1f-b625-de969a73a990-combined-ca-bundle\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.553654 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a4be52-9eb9-4c1f-b625-de969a73a990-memcached-tls-certs\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.572404 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dcgd\" (UniqueName: \"kubernetes.io/projected/68a4be52-9eb9-4c1f-b625-de969a73a990-kube-api-access-8dcgd\") pod \"memcached-0\" (UID: \"68a4be52-9eb9-4c1f-b625-de969a73a990\") " pod="openstack/memcached-0" Dec 06 01:20:51 crc kubenswrapper[4991]: I1206 01:20:51.650095 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.599755 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.600954 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.603163 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7nbm8" Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.610551 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.773657 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqbh5\" (UniqueName: \"kubernetes.io/projected/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8-kube-api-access-jqbh5\") pod \"kube-state-metrics-0\" (UID: \"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8\") " pod="openstack/kube-state-metrics-0" Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.875879 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqbh5\" (UniqueName: \"kubernetes.io/projected/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8-kube-api-access-jqbh5\") pod \"kube-state-metrics-0\" (UID: \"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8\") " pod="openstack/kube-state-metrics-0" Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.898841 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqbh5\" (UniqueName: \"kubernetes.io/projected/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8-kube-api-access-jqbh5\") pod \"kube-state-metrics-0\" (UID: \"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8\") " pod="openstack/kube-state-metrics-0" Dec 06 01:20:53 crc kubenswrapper[4991]: I1206 01:20:53.923822 4991 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 01:20:54 crc kubenswrapper[4991]: I1206 01:20:54.329807 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af","Type":"ContainerStarted","Data":"83246f8482c1f84400f6204ad561026e90de77e8cc0c9cdc11925e1cb185a0df"} Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.691015 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.692650 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.696155 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.696221 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.701252 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2rt27" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.703380 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.713311 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.719563 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852021 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c775c23-51a7-4213-9a70-1382e7c55a1e-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852071 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwpk\" (UniqueName: \"kubernetes.io/projected/2c775c23-51a7-4213-9a70-1382e7c55a1e-kube-api-access-wgwpk\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852103 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852125 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852157 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852172 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c775c23-51a7-4213-9a70-1382e7c55a1e-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " 
pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852198 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.852219 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c775c23-51a7-4213-9a70-1382e7c55a1e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953186 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwpk\" (UniqueName: \"kubernetes.io/projected/2c775c23-51a7-4213-9a70-1382e7c55a1e-kube-api-access-wgwpk\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953257 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953286 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953326 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953344 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c775c23-51a7-4213-9a70-1382e7c55a1e-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953370 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953389 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c775c23-51a7-4213-9a70-1382e7c55a1e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953437 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c775c23-51a7-4213-9a70-1382e7c55a1e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.953701 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"2c775c23-51a7-4213-9a70-1382e7c55a1e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.954147 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c775c23-51a7-4213-9a70-1382e7c55a1e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.954726 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c775c23-51a7-4213-9a70-1382e7c55a1e-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.955364 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c775c23-51a7-4213-9a70-1382e7c55a1e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.958525 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.970192 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.975591 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c775c23-51a7-4213-9a70-1382e7c55a1e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.976208 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwpk\" (UniqueName: \"kubernetes.io/projected/2c775c23-51a7-4213-9a70-1382e7c55a1e-kube-api-access-wgwpk\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:56 crc kubenswrapper[4991]: I1206 01:20:56.989879 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2c775c23-51a7-4213-9a70-1382e7c55a1e\") " pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.011769 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.473724 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-82ddm"] Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.475248 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.477847 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jrsf4" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.478177 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.482372 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.490604 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82ddm"] Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.495005 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-529dn"] Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.498507 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.526959 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-529dn"] Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563395 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3995a5a-dfc3-4769-8161-31f7e2234d13-ovn-controller-tls-certs\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563443 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3995a5a-dfc3-4769-8161-31f7e2234d13-combined-ca-bundle\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563500 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbvg\" (UniqueName: \"kubernetes.io/projected/a3995a5a-dfc3-4769-8161-31f7e2234d13-kube-api-access-9qbvg\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563533 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-log-ovn\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563680 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/a3995a5a-dfc3-4769-8161-31f7e2234d13-scripts\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563734 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-run-ovn\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.563891 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-run\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.664928 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-run\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.664971 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-etc-ovs\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbvg\" (UniqueName: 
\"kubernetes.io/projected/a3995a5a-dfc3-4769-8161-31f7e2234d13-kube-api-access-9qbvg\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665066 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-log-ovn\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665098 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3995a5a-dfc3-4769-8161-31f7e2234d13-scripts\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665116 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-run-ovn\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665147 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-lib\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665176 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-log\") pod \"ovn-controller-ovs-529dn\" (UID: 
\"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665193 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-run\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665217 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e642a31-7110-47d8-b322-91900cf49151-scripts\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665243 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3995a5a-dfc3-4769-8161-31f7e2234d13-ovn-controller-tls-certs\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665264 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3995a5a-dfc3-4769-8161-31f7e2234d13-combined-ca-bundle\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665281 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6qg\" (UniqueName: \"kubernetes.io/projected/6e642a31-7110-47d8-b322-91900cf49151-kube-api-access-7t6qg\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " 
pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665803 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-log-ovn\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.665920 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-run\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.666445 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3995a5a-dfc3-4769-8161-31f7e2234d13-var-run-ovn\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.667849 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3995a5a-dfc3-4769-8161-31f7e2234d13-scripts\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.669084 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3995a5a-dfc3-4769-8161-31f7e2234d13-ovn-controller-tls-certs\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.674002 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3995a5a-dfc3-4769-8161-31f7e2234d13-combined-ca-bundle\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.680380 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbvg\" (UniqueName: \"kubernetes.io/projected/a3995a5a-dfc3-4769-8161-31f7e2234d13-kube-api-access-9qbvg\") pod \"ovn-controller-82ddm\" (UID: \"a3995a5a-dfc3-4769-8161-31f7e2234d13\") " pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.766741 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6qg\" (UniqueName: \"kubernetes.io/projected/6e642a31-7110-47d8-b322-91900cf49151-kube-api-access-7t6qg\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.766829 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-run\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.766867 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-etc-ovs\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.766943 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-lib\") pod 
\"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.767010 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-log\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.767043 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e642a31-7110-47d8-b322-91900cf49151-scripts\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.767321 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-run\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.767426 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-log\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.767445 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-var-lib\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.767457 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6e642a31-7110-47d8-b322-91900cf49151-etc-ovs\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.770186 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e642a31-7110-47d8-b322-91900cf49151-scripts\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.782683 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6qg\" (UniqueName: \"kubernetes.io/projected/6e642a31-7110-47d8-b322-91900cf49151-kube-api-access-7t6qg\") pod \"ovn-controller-ovs-529dn\" (UID: \"6e642a31-7110-47d8-b322-91900cf49151\") " pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.818146 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82ddm" Dec 06 01:20:57 crc kubenswrapper[4991]: I1206 01:20:57.829585 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.251039 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.252674 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.254820 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.255290 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5nn8j" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.255482 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.255815 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.269104 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351240 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e154c690-d20d-463c-af7d-257a3ae60f3c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351295 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e154c690-d20d-463c-af7d-257a3ae60f3c-config\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351333 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfhmr\" (UniqueName: \"kubernetes.io/projected/e154c690-d20d-463c-af7d-257a3ae60f3c-kube-api-access-mfhmr\") pod \"ovsdbserver-nb-0\" (UID: 
\"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351358 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e154c690-d20d-463c-af7d-257a3ae60f3c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351562 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351617 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351819 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.351865 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " 
pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453559 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e154c690-d20d-463c-af7d-257a3ae60f3c-config\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453665 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfhmr\" (UniqueName: \"kubernetes.io/projected/e154c690-d20d-463c-af7d-257a3ae60f3c-kube-api-access-mfhmr\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453718 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e154c690-d20d-463c-af7d-257a3ae60f3c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453788 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453816 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453884 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453902 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.453958 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e154c690-d20d-463c-af7d-257a3ae60f3c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.454318 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.454739 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e154c690-d20d-463c-af7d-257a3ae60f3c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.454960 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e154c690-d20d-463c-af7d-257a3ae60f3c-config\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " 
pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.455671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e154c690-d20d-463c-af7d-257a3ae60f3c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.464480 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.465825 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.468368 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e154c690-d20d-463c-af7d-257a3ae60f3c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.478420 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfhmr\" (UniqueName: \"kubernetes.io/projected/e154c690-d20d-463c-af7d-257a3ae60f3c-kube-api-access-mfhmr\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.479358 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e154c690-d20d-463c-af7d-257a3ae60f3c\") " pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:00 crc kubenswrapper[4991]: I1206 01:21:00.614885 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.384331 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.385208 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xstdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-tc69q_openstack(ceab5909-eb2b-44e4-a0e2-3dbcfa7db592): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.386410 4991 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" podUID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.399886 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.400194 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jsmwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-k6rsf_openstack(4d13b5e3-846c-44ea-ae2d-3e4e28261447): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.402082 4991 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" podUID="4d13b5e3-846c-44ea-ae2d-3e4e28261447" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.420828 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.420976 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sr6mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-l2qb2_openstack(e3da75e9-2e39-490d-a4f6-ab86c82a7ce1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.422220 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" podUID="e3da75e9-2e39-490d-a4f6-ab86c82a7ce1" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.441605 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" podUID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.444938 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.445109 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hd4nd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-pqmt8_openstack(b6d417b0-d165-455f-9505-de10a93777c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:21:06 crc kubenswrapper[4991]: E1206 01:21:06.446261 4991 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" podUID="b6d417b0-d165-455f-9505-de10a93777c4" Dec 06 01:21:07 crc kubenswrapper[4991]: E1206 01:21:07.446374 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" podUID="b6d417b0-d165-455f-9505-de10a93777c4" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.716225 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.723284 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.833287 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr6mf\" (UniqueName: \"kubernetes.io/projected/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-kube-api-access-sr6mf\") pod \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.833715 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-config\") pod \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\" (UID: \"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1\") " Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.833753 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsmwt\" (UniqueName: \"kubernetes.io/projected/4d13b5e3-846c-44ea-ae2d-3e4e28261447-kube-api-access-jsmwt\") pod 
\"4d13b5e3-846c-44ea-ae2d-3e4e28261447\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.833816 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-dns-svc\") pod \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.833961 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-config\") pod \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\" (UID: \"4d13b5e3-846c-44ea-ae2d-3e4e28261447\") " Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.834564 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d13b5e3-846c-44ea-ae2d-3e4e28261447" (UID: "4d13b5e3-846c-44ea-ae2d-3e4e28261447"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.834591 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-config" (OuterVolumeSpecName: "config") pod "e3da75e9-2e39-490d-a4f6-ab86c82a7ce1" (UID: "e3da75e9-2e39-490d-a4f6-ab86c82a7ce1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.834752 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-config" (OuterVolumeSpecName: "config") pod "4d13b5e3-846c-44ea-ae2d-3e4e28261447" (UID: "4d13b5e3-846c-44ea-ae2d-3e4e28261447"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.838837 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-kube-api-access-sr6mf" (OuterVolumeSpecName: "kube-api-access-sr6mf") pod "e3da75e9-2e39-490d-a4f6-ab86c82a7ce1" (UID: "e3da75e9-2e39-490d-a4f6-ab86c82a7ce1"). InnerVolumeSpecName "kube-api-access-sr6mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.839120 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d13b5e3-846c-44ea-ae2d-3e4e28261447-kube-api-access-jsmwt" (OuterVolumeSpecName: "kube-api-access-jsmwt") pod "4d13b5e3-846c-44ea-ae2d-3e4e28261447" (UID: "4d13b5e3-846c-44ea-ae2d-3e4e28261447"). InnerVolumeSpecName "kube-api-access-jsmwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.935623 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.935656 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr6mf\" (UniqueName: \"kubernetes.io/projected/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-kube-api-access-sr6mf\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.935667 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.935676 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsmwt\" (UniqueName: 
\"kubernetes.io/projected/4d13b5e3-846c-44ea-ae2d-3e4e28261447-kube-api-access-jsmwt\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:07 crc kubenswrapper[4991]: I1206 01:21:07.935685 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d13b5e3-846c-44ea-ae2d-3e4e28261447-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.215648 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.237402 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 06 01:21:08 crc kubenswrapper[4991]: W1206 01:21:08.239937 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode937b594_10b6_4742_a47a_7d21f0c19636.slice/crio-cd950a8e4979eb3a726bd9ea4cbf914c4f85e128233ee5614e0f60969f0340ef WatchSource:0}: Error finding container cd950a8e4979eb3a726bd9ea4cbf914c4f85e128233ee5614e0f60969f0340ef: Status 404 returned error can't find the container with id cd950a8e4979eb3a726bd9ea4cbf914c4f85e128233ee5614e0f60969f0340ef Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.364257 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-529dn"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.402511 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 06 01:21:08 crc kubenswrapper[4991]: W1206 01:21:08.418393 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b6ac217_8e85_4279_b2d1_7dd35227f828.slice/crio-78078d9d916cf94f758d6b6a71b73685ec2e57fb13005cd7bb8b61e5f81c161f WatchSource:0}: Error finding container 78078d9d916cf94f758d6b6a71b73685ec2e57fb13005cd7bb8b61e5f81c161f: Status 404 returned error can't find the 
container with id 78078d9d916cf94f758d6b6a71b73685ec2e57fb13005cd7bb8b61e5f81c161f Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.421357 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:21:08 crc kubenswrapper[4991]: W1206 01:21:08.428350 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8f4ebbe_9528_4c3d_b28c_ac0eff0176d8.slice/crio-668a2e308d7ca0fab9a9fa25c913a5bf844fd23179999c4eb672a974283b9773 WatchSource:0}: Error finding container 668a2e308d7ca0fab9a9fa25c913a5bf844fd23179999c4eb672a974283b9773: Status 404 returned error can't find the container with id 668a2e308d7ca0fab9a9fa25c913a5bf844fd23179999c4eb672a974283b9773 Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.449406 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e937b594-10b6-4742-a47a-7d21f0c19636","Type":"ContainerStarted","Data":"cd950a8e4979eb3a726bd9ea4cbf914c4f85e128233ee5614e0f60969f0340ef"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.451167 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8","Type":"ContainerStarted","Data":"668a2e308d7ca0fab9a9fa25c913a5bf844fd23179999c4eb672a974283b9773"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.452724 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" event={"ID":"e3da75e9-2e39-490d-a4f6-ab86c82a7ce1","Type":"ContainerDied","Data":"1abd97eefeb5e5fb4f43caed88683f30e450a8d7546d0c67ebce6da6a29b1a75"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.452793 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l2qb2" Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.457786 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-529dn" event={"ID":"6e642a31-7110-47d8-b322-91900cf49151","Type":"ContainerStarted","Data":"d5cca043b17806ec922ddb770aad05fc8a6be8e916ed3fe9ebd64da1560be5db"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.458757 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"68a4be52-9eb9-4c1f-b625-de969a73a990","Type":"ContainerStarted","Data":"4dc549f63230bcc505bb80b0a3c99fb64d40eb2131d9425d03c91b5c6feb8ee9"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.459822 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" event={"ID":"4d13b5e3-846c-44ea-ae2d-3e4e28261447","Type":"ContainerDied","Data":"3953203c4cb4439215b38c930c8be15e8ed5cf3a66f85ffe2d88917576eb947e"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.459902 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k6rsf" Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.460821 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b6ac217-8e85-4279-b2d1-7dd35227f828","Type":"ContainerStarted","Data":"78078d9d916cf94f758d6b6a71b73685ec2e57fb13005cd7bb8b61e5f81c161f"} Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.489578 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 06 01:21:08 crc kubenswrapper[4991]: W1206 01:21:08.505791 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c775c23_51a7_4213_9a70_1382e7c55a1e.slice/crio-19757fbe2dc6a6612475ef791f5bfc162c2bbdeaa92913bb7f788e3d754b2c7e WatchSource:0}: Error finding container 19757fbe2dc6a6612475ef791f5bfc162c2bbdeaa92913bb7f788e3d754b2c7e: Status 404 returned error can't find the container with id 19757fbe2dc6a6612475ef791f5bfc162c2bbdeaa92913bb7f788e3d754b2c7e Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.516167 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l2qb2"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.521457 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l2qb2"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.545655 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k6rsf"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.581296 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k6rsf"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.607700 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82ddm"] Dec 06 01:21:08 crc kubenswrapper[4991]: I1206 01:21:08.676218 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Dec 06 01:21:08 crc kubenswrapper[4991]: W1206 01:21:08.785409 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode154c690_d20d_463c_af7d_257a3ae60f3c.slice/crio-c6a77d4ab66925ea20ddbe670564a297173d1b775cab01e33e8eb5d3e67463c6 WatchSource:0}: Error finding container c6a77d4ab66925ea20ddbe670564a297173d1b775cab01e33e8eb5d3e67463c6: Status 404 returned error can't find the container with id c6a77d4ab66925ea20ddbe670564a297173d1b775cab01e33e8eb5d3e67463c6 Dec 06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.050840 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d13b5e3-846c-44ea-ae2d-3e4e28261447" path="/var/lib/kubelet/pods/4d13b5e3-846c-44ea-ae2d-3e4e28261447/volumes" Dec 06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.051261 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3da75e9-2e39-490d-a4f6-ab86c82a7ce1" path="/var/lib/kubelet/pods/e3da75e9-2e39-490d-a4f6-ab86c82a7ce1/volumes" Dec 06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.471366 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8380dcf2-30be-4578-937b-5e2a8f4fc074","Type":"ContainerStarted","Data":"906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a"} Dec 06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.474045 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e154c690-d20d-463c-af7d-257a3ae60f3c","Type":"ContainerStarted","Data":"c6a77d4ab66925ea20ddbe670564a297173d1b775cab01e33e8eb5d3e67463c6"} Dec 06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.476107 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af","Type":"ContainerStarted","Data":"ba88aa1a0ccfd2b59d2138748ee66af24deb48ba69955de9f56aa083e4ca9001"} Dec 
06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.477741 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c775c23-51a7-4213-9a70-1382e7c55a1e","Type":"ContainerStarted","Data":"19757fbe2dc6a6612475ef791f5bfc162c2bbdeaa92913bb7f788e3d754b2c7e"} Dec 06 01:21:09 crc kubenswrapper[4991]: I1206 01:21:09.479087 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm" event={"ID":"a3995a5a-dfc3-4769-8161-31f7e2234d13","Type":"ContainerStarted","Data":"b353b624b81af83c5b3476533a326757757a690f7d24bb8835f409dce634d3de"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.547519 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-529dn" event={"ID":"6e642a31-7110-47d8-b322-91900cf49151","Type":"ContainerStarted","Data":"3da548c00a022ec50f6481f9c114d85d59c82e1d40f291c9f351bdc18d455535"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.550707 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"68a4be52-9eb9-4c1f-b625-de969a73a990","Type":"ContainerStarted","Data":"b5c8e474698b7dd83b831f056cdb6275a033b41188a707117ac323f89b6210fe"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.551161 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.553047 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c775c23-51a7-4213-9a70-1382e7c55a1e","Type":"ContainerStarted","Data":"238917f738cf8417ac61bc7660c5b385307474d8623cd8f2a0aca1ea01ec2575"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.554886 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm" event={"ID":"a3995a5a-dfc3-4769-8161-31f7e2234d13","Type":"ContainerStarted","Data":"1521cb4b992eabf3a0677c59e34f0c33265b751af5c3a6a665027bd231c0b14f"} Dec 06 
01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.555060 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-82ddm" Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.556819 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b6ac217-8e85-4279-b2d1-7dd35227f828","Type":"ContainerStarted","Data":"a75a0613065aaeaa84ac215b026eb5525b22d9cb8199da234a7083c8e0486494"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.558477 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e937b594-10b6-4742-a47a-7d21f0c19636","Type":"ContainerStarted","Data":"c866f34b320948e79a6507085f726dacff78fccb077166ff265410279ace90c1"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.559884 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e154c690-d20d-463c-af7d-257a3ae60f3c","Type":"ContainerStarted","Data":"c3a3bc596196e56ec00c769b4d65395804fbe31c8525276f89d4ea5152c3edc1"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.561737 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8","Type":"ContainerStarted","Data":"f3a1e7416e7bf4f0b9c7aec629a783a2ad163987c711023f3b779d6c2d3fa94e"} Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.561893 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.647647 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-82ddm" podStartSLOduration=13.394117705 podStartE2EDuration="19.647627353s" podCreationTimestamp="2025-12-06 01:20:57 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.688780918 +0000 UTC m=+1142.039071386" lastFinishedPulling="2025-12-06 01:21:14.942290556 +0000 UTC 
m=+1148.292581034" observedRunningTime="2025-12-06 01:21:16.623772768 +0000 UTC m=+1149.974063236" watchObservedRunningTime="2025-12-06 01:21:16.647627353 +0000 UTC m=+1149.997917821" Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.686691 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.117783679 podStartE2EDuration="23.686652775s" podCreationTimestamp="2025-12-06 01:20:53 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.433507854 +0000 UTC m=+1141.783798312" lastFinishedPulling="2025-12-06 01:21:16.00237693 +0000 UTC m=+1149.352667408" observedRunningTime="2025-12-06 01:21:16.679346793 +0000 UTC m=+1150.029637261" watchObservedRunningTime="2025-12-06 01:21:16.686652775 +0000 UTC m=+1150.036943243" Dec 06 01:21:16 crc kubenswrapper[4991]: I1206 01:21:16.709975 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.255618628 podStartE2EDuration="25.709948015s" podCreationTimestamp="2025-12-06 01:20:51 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.224912763 +0000 UTC m=+1141.575203231" lastFinishedPulling="2025-12-06 01:21:14.67924212 +0000 UTC m=+1148.029532618" observedRunningTime="2025-12-06 01:21:16.703136676 +0000 UTC m=+1150.053427144" watchObservedRunningTime="2025-12-06 01:21:16.709948015 +0000 UTC m=+1150.060238493" Dec 06 01:21:17 crc kubenswrapper[4991]: I1206 01:21:17.596047 4991 generic.go:334] "Generic (PLEG): container finished" podID="6e642a31-7110-47d8-b322-91900cf49151" containerID="3da548c00a022ec50f6481f9c114d85d59c82e1d40f291c9f351bdc18d455535" exitCode=0 Dec 06 01:21:17 crc kubenswrapper[4991]: I1206 01:21:17.596145 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-529dn" event={"ID":"6e642a31-7110-47d8-b322-91900cf49151","Type":"ContainerDied","Data":"3da548c00a022ec50f6481f9c114d85d59c82e1d40f291c9f351bdc18d455535"} Dec 06 01:21:18 crc 
kubenswrapper[4991]: I1206 01:21:18.623510 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-529dn" event={"ID":"6e642a31-7110-47d8-b322-91900cf49151","Type":"ContainerStarted","Data":"fa1eee77990a50f68cd95342feb379ced4182e25ccda11baa84273e5eec8e974"} Dec 06 01:21:18 crc kubenswrapper[4991]: I1206 01:21:18.623906 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-529dn" event={"ID":"6e642a31-7110-47d8-b322-91900cf49151","Type":"ContainerStarted","Data":"bbc2245a9b3dd416a1b1c1f38d3e469baac083b0bff1a0183222312def332674"} Dec 06 01:21:18 crc kubenswrapper[4991]: I1206 01:21:18.624048 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:21:19 crc kubenswrapper[4991]: I1206 01:21:19.631799 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.057619 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-529dn" podStartSLOduration=16.367770901 podStartE2EDuration="23.057596922s" podCreationTimestamp="2025-12-06 01:20:57 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.374356276 +0000 UTC m=+1141.724646744" lastFinishedPulling="2025-12-06 01:21:15.064182297 +0000 UTC m=+1148.414472765" observedRunningTime="2025-12-06 01:21:18.647490103 +0000 UTC m=+1151.997780571" watchObservedRunningTime="2025-12-06 01:21:20.057596922 +0000 UTC m=+1153.407887390" Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.644765 4991 generic.go:334] "Generic (PLEG): container finished" podID="1b6ac217-8e85-4279-b2d1-7dd35227f828" containerID="a75a0613065aaeaa84ac215b026eb5525b22d9cb8199da234a7083c8e0486494" exitCode=0 Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.644931 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"1b6ac217-8e85-4279-b2d1-7dd35227f828","Type":"ContainerDied","Data":"a75a0613065aaeaa84ac215b026eb5525b22d9cb8199da234a7083c8e0486494"} Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.653666 4991 generic.go:334] "Generic (PLEG): container finished" podID="e937b594-10b6-4742-a47a-7d21f0c19636" containerID="c866f34b320948e79a6507085f726dacff78fccb077166ff265410279ace90c1" exitCode=0 Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.653805 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e937b594-10b6-4742-a47a-7d21f0c19636","Type":"ContainerDied","Data":"c866f34b320948e79a6507085f726dacff78fccb077166ff265410279ace90c1"} Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.663608 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e154c690-d20d-463c-af7d-257a3ae60f3c","Type":"ContainerStarted","Data":"dae9f274bd301a7e2aa992d4744a87a3ccf3aa983baf79402f1824827d252ab9"} Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.671002 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c775c23-51a7-4213-9a70-1382e7c55a1e","Type":"ContainerStarted","Data":"7065463e85487a730c2164e9beb04a9523d339f334a752627a2b226c3d64368a"} Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.733337 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.290205385 podStartE2EDuration="25.733311474s" podCreationTimestamp="2025-12-06 01:20:55 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.509693889 +0000 UTC m=+1141.859984367" lastFinishedPulling="2025-12-06 01:21:19.952799938 +0000 UTC m=+1153.303090456" observedRunningTime="2025-12-06 01:21:20.725330405 +0000 UTC m=+1154.075620883" watchObservedRunningTime="2025-12-06 01:21:20.733311474 +0000 UTC m=+1154.083601952" Dec 06 01:21:20 crc kubenswrapper[4991]: I1206 01:21:20.761230 
4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.629438936 podStartE2EDuration="21.761212254s" podCreationTimestamp="2025-12-06 01:20:59 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.787311578 +0000 UTC m=+1142.137602056" lastFinishedPulling="2025-12-06 01:21:19.919084896 +0000 UTC m=+1153.269375374" observedRunningTime="2025-12-06 01:21:20.756063709 +0000 UTC m=+1154.106354187" watchObservedRunningTime="2025-12-06 01:21:20.761212254 +0000 UTC m=+1154.111502712" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.012486 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.070959 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.616037 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.656224 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.657276 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.688097 4991 generic.go:334] "Generic (PLEG): container finished" podID="b6d417b0-d165-455f-9505-de10a93777c4" containerID="c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858" exitCode=0 Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.688175 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" event={"ID":"b6d417b0-d165-455f-9505-de10a93777c4","Type":"ContainerDied","Data":"c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858"} Dec 06 01:21:21 crc 
kubenswrapper[4991]: I1206 01:21:21.691719 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b6ac217-8e85-4279-b2d1-7dd35227f828","Type":"ContainerStarted","Data":"4b9ec36758f37d698f4aad874e225a4b927a878fc7a28ffe790b28283a513f82"} Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.695184 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e937b594-10b6-4742-a47a-7d21f0c19636","Type":"ContainerStarted","Data":"29f76955368f93039ae9b6a9afc7d6fa215b36cf22231a781e6c70db84616791"} Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.695632 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.696237 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.754175 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.110091518 podStartE2EDuration="33.754126771s" podCreationTimestamp="2025-12-06 01:20:48 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.420594136 +0000 UTC m=+1141.770884614" lastFinishedPulling="2025-12-06 01:21:15.064629369 +0000 UTC m=+1148.414919867" observedRunningTime="2025-12-06 01:21:21.73807077 +0000 UTC m=+1155.088361238" watchObservedRunningTime="2025-12-06 01:21:21.754126771 +0000 UTC m=+1155.104417239" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.787627 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.090426463 podStartE2EDuration="32.787605017s" podCreationTimestamp="2025-12-06 01:20:49 +0000 UTC" firstStartedPulling="2025-12-06 01:21:08.244395613 +0000 UTC m=+1141.594686081" lastFinishedPulling="2025-12-06 01:21:14.941574127 +0000 UTC 
m=+1148.291864635" observedRunningTime="2025-12-06 01:21:21.77547482 +0000 UTC m=+1155.125765288" watchObservedRunningTime="2025-12-06 01:21:21.787605017 +0000 UTC m=+1155.137895485" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.802068 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 06 01:21:21 crc kubenswrapper[4991]: I1206 01:21:21.813002 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.302123 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tc69q"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.335942 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6dw72"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.337336 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.354523 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6dw72"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.354532 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.408057 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ltzgn"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.409730 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.412685 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.438587 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ltzgn"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.443195 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xng\" (UniqueName: \"kubernetes.io/projected/911fb4db-d797-447e-9aa3-671f50be1815-kube-api-access-54xng\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.443341 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-config\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.443470 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.443582 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.509140 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.510601 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.513501 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.513645 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.513797 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hg5lq" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.520372 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.530599 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pqmt8"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.545820 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-config\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.545869 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e130223-fd73-41e3-8af7-d0637555c009-combined-ca-bundle\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 
crc kubenswrapper[4991]: I1206 01:21:22.547059 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-config\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.547117 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560042 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560185 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560248 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e130223-fd73-41e3-8af7-d0637555c009-config\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560275 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e130223-fd73-41e3-8af7-d0637555c009-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ltzgn\" (UID: 
\"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560343 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e130223-fd73-41e3-8af7-d0637555c009-ovn-rundir\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560443 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7ts\" (UniqueName: \"kubernetes.io/projected/3e130223-fd73-41e3-8af7-d0637555c009-kube-api-access-gj7ts\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560500 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xng\" (UniqueName: \"kubernetes.io/projected/911fb4db-d797-447e-9aa3-671f50be1815-kube-api-access-54xng\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.560536 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e130223-fd73-41e3-8af7-d0637555c009-ovs-rundir\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.561363 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" 
(UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.561667 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-dlhpm"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.562120 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.567490 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.574641 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.581130 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dlhpm"] Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.601505 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xng\" (UniqueName: \"kubernetes.io/projected/911fb4db-d797-447e-9aa3-671f50be1815-kube-api-access-54xng\") pod \"dnsmasq-dns-6bc7876d45-6dw72\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663080 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e130223-fd73-41e3-8af7-d0637555c009-ovs-rundir\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663663 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-config\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663693 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cznbb\" (UniqueName: \"kubernetes.io/projected/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-kube-api-access-cznbb\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663489 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3e130223-fd73-41e3-8af7-d0637555c009-ovs-rundir\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663721 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e130223-fd73-41e3-8af7-d0637555c009-combined-ca-bundle\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663842 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.663931 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-dns-svc\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664132 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-scripts\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664222 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e130223-fd73-41e3-8af7-d0637555c009-config\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664256 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e130223-fd73-41e3-8af7-d0637555c009-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664363 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664395 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664428 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e130223-fd73-41e3-8af7-d0637555c009-ovn-rundir\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664502 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-config\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664555 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664590 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p99t6\" (UniqueName: \"kubernetes.io/projected/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-kube-api-access-p99t6\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664611 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664667 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7ts\" (UniqueName: \"kubernetes.io/projected/3e130223-fd73-41e3-8af7-d0637555c009-kube-api-access-gj7ts\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.664689 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.665523 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e130223-fd73-41e3-8af7-d0637555c009-config\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.665942 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3e130223-fd73-41e3-8af7-d0637555c009-ovn-rundir\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.669961 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e130223-fd73-41e3-8af7-d0637555c009-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.671071 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e130223-fd73-41e3-8af7-d0637555c009-combined-ca-bundle\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.679460 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.695122 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7ts\" (UniqueName: \"kubernetes.io/projected/3e130223-fd73-41e3-8af7-d0637555c009-kube-api-access-gj7ts\") pod \"ovn-controller-metrics-ltzgn\" (UID: \"3e130223-fd73-41e3-8af7-d0637555c009\") " pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.706222 4991 generic.go:334] "Generic (PLEG): container finished" podID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" containerID="a4d2ff3d829d09e64b059cf9d15d8764ece7fa745250fd87798a8d28d07c2ccd" exitCode=0 Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.706440 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" event={"ID":"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592","Type":"ContainerDied","Data":"a4d2ff3d829d09e64b059cf9d15d8764ece7fa745250fd87798a8d28d07c2ccd"} Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.717219 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" podUID="b6d417b0-d165-455f-9505-de10a93777c4" containerName="dnsmasq-dns" 
containerID="cri-o://cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516" gracePeriod=10 Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.717332 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" event={"ID":"b6d417b0-d165-455f-9505-de10a93777c4","Type":"ContainerStarted","Data":"cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516"} Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.717776 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.756852 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" podStartSLOduration=3.775916475 podStartE2EDuration="36.756835273s" podCreationTimestamp="2025-12-06 01:20:46 +0000 UTC" firstStartedPulling="2025-12-06 01:20:47.492797979 +0000 UTC m=+1120.843088447" lastFinishedPulling="2025-12-06 01:21:20.473716737 +0000 UTC m=+1153.824007245" observedRunningTime="2025-12-06 01:21:22.756316879 +0000 UTC m=+1156.106607347" watchObservedRunningTime="2025-12-06 01:21:22.756835273 +0000 UTC m=+1156.107125741" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766377 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-config\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766434 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 
01:21:22.766458 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p99t6\" (UniqueName: \"kubernetes.io/projected/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-kube-api-access-p99t6\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766477 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766497 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766537 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-config\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766558 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cznbb\" (UniqueName: \"kubernetes.io/projected/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-kube-api-access-cznbb\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766580 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766605 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-dns-svc\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766635 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-scripts\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766671 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.766689 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.767373 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-config\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 
06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.767432 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.768131 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-config\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.768737 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.769232 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-scripts\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.769868 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.770106 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-dns-svc\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.771875 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.772728 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.774519 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.778288 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ltzgn" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.786127 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p99t6\" (UniqueName: \"kubernetes.io/projected/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-kube-api-access-p99t6\") pod \"dnsmasq-dns-8554648995-dlhpm\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.788083 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cznbb\" (UniqueName: \"kubernetes.io/projected/49a9417e-79cd-4b0c-a5b0-5528a9e20edc-kube-api-access-cznbb\") pod \"ovn-northd-0\" (UID: \"49a9417e-79cd-4b0c-a5b0-5528a9e20edc\") " pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.841522 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 06 01:21:22 crc kubenswrapper[4991]: I1206 01:21:22.895963 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.115040 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.241425 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.274846 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-config\") pod \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.275071 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xstdg\" (UniqueName: \"kubernetes.io/projected/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-kube-api-access-xstdg\") pod \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.275109 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-dns-svc\") pod \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\" (UID: \"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592\") " Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.298727 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" (UID: "ceab5909-eb2b-44e4-a0e2-3dbcfa7db592"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.305314 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-kube-api-access-xstdg" (OuterVolumeSpecName: "kube-api-access-xstdg") pod "ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" (UID: "ceab5909-eb2b-44e4-a0e2-3dbcfa7db592"). InnerVolumeSpecName "kube-api-access-xstdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.310068 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-config" (OuterVolumeSpecName: "config") pod "ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" (UID: "ceab5909-eb2b-44e4-a0e2-3dbcfa7db592"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.363819 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6dw72"] Dec 06 01:21:23 crc kubenswrapper[4991]: W1206 01:21:23.375382 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod911fb4db_d797_447e_9aa3_671f50be1815.slice/crio-c9e5d7b74a720a0eeb9c0598c68509962531c7fc14144f90d6d867ae476d7c91 WatchSource:0}: Error finding container c9e5d7b74a720a0eeb9c0598c68509962531c7fc14144f90d6d867ae476d7c91: Status 404 returned error can't find the container with id c9e5d7b74a720a0eeb9c0598c68509962531c7fc14144f90d6d867ae476d7c91 Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.376582 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4nd\" (UniqueName: \"kubernetes.io/projected/b6d417b0-d165-455f-9505-de10a93777c4-kube-api-access-hd4nd\") pod \"b6d417b0-d165-455f-9505-de10a93777c4\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.376678 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-dns-svc\") pod \"b6d417b0-d165-455f-9505-de10a93777c4\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.376747 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-config\") pod \"b6d417b0-d165-455f-9505-de10a93777c4\" (UID: \"b6d417b0-d165-455f-9505-de10a93777c4\") " Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.377126 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xstdg\" (UniqueName: \"kubernetes.io/projected/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-kube-api-access-xstdg\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.377145 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.377154 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.383108 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d417b0-d165-455f-9505-de10a93777c4-kube-api-access-hd4nd" (OuterVolumeSpecName: "kube-api-access-hd4nd") pod "b6d417b0-d165-455f-9505-de10a93777c4" (UID: "b6d417b0-d165-455f-9505-de10a93777c4"). InnerVolumeSpecName "kube-api-access-hd4nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.414427 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-config" (OuterVolumeSpecName: "config") pod "b6d417b0-d165-455f-9505-de10a93777c4" (UID: "b6d417b0-d165-455f-9505-de10a93777c4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.422732 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6d417b0-d165-455f-9505-de10a93777c4" (UID: "b6d417b0-d165-455f-9505-de10a93777c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:23 crc kubenswrapper[4991]: W1206 01:21:23.440690 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e130223_fd73_41e3_8af7_d0637555c009.slice/crio-b282dd6bb2d8a3b3bb1e586d24bf8f5dcec3d875c160876f20e2b670c4089b1b WatchSource:0}: Error finding container b282dd6bb2d8a3b3bb1e586d24bf8f5dcec3d875c160876f20e2b670c4089b1b: Status 404 returned error can't find the container with id b282dd6bb2d8a3b3bb1e586d24bf8f5dcec3d875c160876f20e2b670c4089b1b Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.447000 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ltzgn"] Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.479680 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.479752 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4nd\" (UniqueName: \"kubernetes.io/projected/b6d417b0-d165-455f-9505-de10a93777c4-kube-api-access-hd4nd\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.479765 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d417b0-d165-455f-9505-de10a93777c4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:23 crc 
kubenswrapper[4991]: I1206 01:21:23.480124 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dlhpm"] Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.541769 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.730103 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ltzgn" event={"ID":"3e130223-fd73-41e3-8af7-d0637555c009","Type":"ContainerStarted","Data":"b282dd6bb2d8a3b3bb1e586d24bf8f5dcec3d875c160876f20e2b670c4089b1b"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.737620 4991 generic.go:334] "Generic (PLEG): container finished" podID="911fb4db-d797-447e-9aa3-671f50be1815" containerID="ea659aa8477ce81114559324bc0c8f5db8c1d4bb27ccd15decc1cc02c36ddeed" exitCode=0 Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.737685 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" event={"ID":"911fb4db-d797-447e-9aa3-671f50be1815","Type":"ContainerDied","Data":"ea659aa8477ce81114559324bc0c8f5db8c1d4bb27ccd15decc1cc02c36ddeed"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.737705 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" event={"ID":"911fb4db-d797-447e-9aa3-671f50be1815","Type":"ContainerStarted","Data":"c9e5d7b74a720a0eeb9c0598c68509962531c7fc14144f90d6d867ae476d7c91"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.740937 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"49a9417e-79cd-4b0c-a5b0-5528a9e20edc","Type":"ContainerStarted","Data":"658dbd7b824fc84c9195416e7062c1bd43fc4bb11d861cc6deafae2cabaef287"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.746339 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" 
event={"ID":"ceab5909-eb2b-44e4-a0e2-3dbcfa7db592","Type":"ContainerDied","Data":"a39f44d5ea884990ba22b36b87a150f29fd868c11d51325ea855eeaafb6597e6"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.746481 4991 scope.go:117] "RemoveContainer" containerID="a4d2ff3d829d09e64b059cf9d15d8764ece7fa745250fd87798a8d28d07c2ccd" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.746640 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-tc69q" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.767439 4991 generic.go:334] "Generic (PLEG): container finished" podID="b6d417b0-d165-455f-9505-de10a93777c4" containerID="cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516" exitCode=0 Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.767571 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" event={"ID":"b6d417b0-d165-455f-9505-de10a93777c4","Type":"ContainerDied","Data":"cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.767602 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" event={"ID":"b6d417b0-d165-455f-9505-de10a93777c4","Type":"ContainerDied","Data":"77ec72aac4fe99a2904b2feb019ff65768da504ca666d9d7cc7ca02bde05152c"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.767868 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pqmt8" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.803083 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dlhpm" event={"ID":"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31","Type":"ContainerStarted","Data":"d95b1aadd43c5fa26fde516673f803479461a565fcc192a91401bb15ce699b25"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.803193 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dlhpm" event={"ID":"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31","Type":"ContainerStarted","Data":"76f739d933a78b9bddb7a5add40d869c3f218ef6b99650ae6889a09c5500ad3f"} Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.864551 4991 scope.go:117] "RemoveContainer" containerID="cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516" Dec 06 01:21:23 crc kubenswrapper[4991]: E1206 01:21:23.918313 4991 mount_linux.go:282] Mount failed: exit status 32 Dec 06 01:21:23 crc kubenswrapper[4991]: Mounting command: mount Dec 06 01:21:23 crc kubenswrapper[4991]: Mounting arguments: --no-canonicalize -o bind /proc/4991/fd/27 /var/lib/kubelet/pods/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31/volume-subpaths/dns-svc/dnsmasq-dns/1 Dec 06 01:21:23 crc kubenswrapper[4991]: Output: mount: /var/lib/kubelet/pods/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31/volume-subpaths/dns-svc/dnsmasq-dns/1: mount(2) system call failed: No such file or directory. 
Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.932418 4991 scope.go:117] "RemoveContainer" containerID="c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858" Dec 06 01:21:23 crc kubenswrapper[4991]: I1206 01:21:23.993616 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6dw72"] Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.010777 4991 kubelet_pods.go:349] "Failed to prepare subPath for volumeMount of the container" err=< Dec 06 01:21:24 crc kubenswrapper[4991]: error mounting /var/lib/kubelet/pods/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31/volumes/kubernetes.io~configmap/dns-svc/..2025_12_06_01_21_22.1969956174/dns-svc: mount failed: exit status 32 Dec 06 01:21:24 crc kubenswrapper[4991]: Mounting command: mount Dec 06 01:21:24 crc kubenswrapper[4991]: Mounting arguments: --no-canonicalize -o bind /proc/4991/fd/27 /var/lib/kubelet/pods/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31/volume-subpaths/dns-svc/dnsmasq-dns/1 Dec 06 01:21:24 crc kubenswrapper[4991]: Output: mount: /var/lib/kubelet/pods/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31/volume-subpaths/dns-svc/dnsmasq-dns/1: mount(2) system call failed: No such file or directory. 
Dec 06 01:21:24 crc kubenswrapper[4991]: > containerName="dnsmasq-dns" volumeMountName="dns-svc" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.010956 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p99t6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHa
ndler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-dlhpm_openstack(7e97b3a5-f53d-4aee-9d21-3ca0d6928b31): CreateContainerConfigError: failed to prepare subPath for volumeMount \"dns-svc\" of container \"dnsmasq-dns\"" logger="UnhandledError" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.016268 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerConfigError: \"failed to prepare subPath for volumeMount \\\"dns-svc\\\" of container \\\"dnsmasq-dns\\\"\"" pod="openstack/dnsmasq-dns-8554648995-dlhpm" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.021148 4991 scope.go:117] "RemoveContainer" containerID="cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 
01:21:24.029204 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.030653 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516\": container with ID starting with cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516 not found: ID does not exist" containerID="cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.030704 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516"} err="failed to get container status \"cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516\": rpc error: code = NotFound desc = could not find container \"cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516\": container with ID starting with cec708f6e3b0cd487561cc67db39ef2994375b652d395bc9aad127ce5e1c7516 not found: ID does not exist" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.030734 4991 scope.go:117] "RemoveContainer" containerID="c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.032806 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858\": container with ID starting with c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858 not found: ID does not exist" containerID="c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.032840 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858"} err="failed to get container status \"c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858\": rpc error: code = NotFound desc = could not find container \"c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858\": container with ID starting with c63d149c1d78342596a7526bf4aab77ac8e9697185226326819460b41ddf2858 not found: ID does not exist" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.055524 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pqmt8"] Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.089121 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pqmt8"] Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.116121 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2wbr6"] Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.116508 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d417b0-d165-455f-9505-de10a93777c4" containerName="dnsmasq-dns" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.116528 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d417b0-d165-455f-9505-de10a93777c4" containerName="dnsmasq-dns" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.116552 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" containerName="init" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.116559 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" containerName="init" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.116577 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d417b0-d165-455f-9505-de10a93777c4" containerName="init" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.116584 4991 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b6d417b0-d165-455f-9505-de10a93777c4" containerName="init" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.116758 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" containerName="init" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.116771 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d417b0-d165-455f-9505-de10a93777c4" containerName="dnsmasq-dns" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.117626 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.124519 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tc69q"] Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.126291 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d417b0_d165_455f_9505_de10a93777c4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceab5909_eb2b_44e4_a0e2_3dbcfa7db592.slice/crio-a39f44d5ea884990ba22b36b87a150f29fd868c11d51325ea855eeaafb6597e6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e97b3a5_f53d_4aee_9d21_3ca0d6928b31.slice/crio-conmon-d95b1aadd43c5fa26fde516673f803479461a565fcc192a91401bb15ce699b25.scope\": RecentStats: unable to find data in memory cache]" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.137781 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-tc69q"] Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.144389 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2wbr6"] Dec 06 01:21:24 
crc kubenswrapper[4991]: I1206 01:21:24.215390 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.215753 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.215842 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-config\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.215895 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.215912 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlc9z\" (UniqueName: \"kubernetes.io/projected/257d40ce-9d2a-4cc4-9e89-4cb777582c47-kube-api-access-tlc9z\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" 
Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.234026 4991 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 06 01:21:24 crc kubenswrapper[4991]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/911fb4db-d797-447e-9aa3-671f50be1815/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 06 01:21:24 crc kubenswrapper[4991]: > podSandboxID="c9e5d7b74a720a0eeb9c0598c68509962531c7fc14144f90d6d867ae476d7c91" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.234226 4991 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 06 01:21:24 crc kubenswrapper[4991]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch647h5fdh676h5c8h566h96h5d8hdh569h64dh5b5h587h55h5cch58dh658h67h5f6h64fh648h6h59fh65ch7hf9hf6h74hf8hch596h5b8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54xng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6bc7876d45-6dw72_openstack(911fb4db-d797-447e-9aa3-671f50be1815): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/911fb4db-d797-447e-9aa3-671f50be1815/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 06 01:21:24 crc kubenswrapper[4991]: > logger="UnhandledError" Dec 06 01:21:24 crc kubenswrapper[4991]: E1206 01:21:24.235797 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/911fb4db-d797-447e-9aa3-671f50be1815/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" podUID="911fb4db-d797-447e-9aa3-671f50be1815" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.318591 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-config\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.318662 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlc9z\" (UniqueName: \"kubernetes.io/projected/257d40ce-9d2a-4cc4-9e89-4cb777582c47-kube-api-access-tlc9z\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.318679 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.318739 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.318767 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.319998 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc 
kubenswrapper[4991]: I1206 01:21:24.320554 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-config\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.320783 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.321219 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.360194 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlc9z\" (UniqueName: \"kubernetes.io/projected/257d40ce-9d2a-4cc4-9e89-4cb777582c47-kube-api-access-tlc9z\") pod \"dnsmasq-dns-b8fbc5445-2wbr6\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.475605 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.810929 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerID="d95b1aadd43c5fa26fde516673f803479461a565fcc192a91401bb15ce699b25" exitCode=0 Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.811330 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dlhpm" event={"ID":"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31","Type":"ContainerDied","Data":"d95b1aadd43c5fa26fde516673f803479461a565fcc192a91401bb15ce699b25"} Dec 06 01:21:24 crc kubenswrapper[4991]: I1206 01:21:24.817221 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ltzgn" event={"ID":"3e130223-fd73-41e3-8af7-d0637555c009","Type":"ContainerStarted","Data":"d26a1e4a22d15d217ef6de7ad40043f9ad324cd54ceb72379296235a4529a43d"} Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.050290 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d417b0-d165-455f-9505-de10a93777c4" path="/var/lib/kubelet/pods/b6d417b0-d165-455f-9505-de10a93777c4/volumes" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.051162 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceab5909-eb2b-44e4-a0e2-3dbcfa7db592" path="/var/lib/kubelet/pods/ceab5909-eb2b-44e4-a0e2-3dbcfa7db592/volumes" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.057477 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ltzgn" podStartSLOduration=3.057459938 podStartE2EDuration="3.057459938s" podCreationTimestamp="2025-12-06 01:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:24.865680797 +0000 UTC m=+1158.215971265" watchObservedRunningTime="2025-12-06 01:21:25.057459938 +0000 
UTC m=+1158.407750406" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.060584 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2wbr6"] Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.070432 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.077719 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.081485 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.081703 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-vwjnc" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.081722 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.081833 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.129103 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.236377 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.236470 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx6wv\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-kube-api-access-wx6wv\") pod \"swift-storage-0\" (UID: 
\"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.236517 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.236556 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/04a746be-9ac3-4e2e-a264-6fc99dc51527-lock\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.236648 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/04a746be-9ac3-4e2e-a264-6fc99dc51527-cache\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.337689 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.337757 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx6wv\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-kube-api-access-wx6wv\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.337794 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.337819 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/04a746be-9ac3-4e2e-a264-6fc99dc51527-lock\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.337867 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/04a746be-9ac3-4e2e-a264-6fc99dc51527-cache\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.338351 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/04a746be-9ac3-4e2e-a264-6fc99dc51527-cache\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.338614 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: E1206 01:21:25.340116 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 01:21:25 crc kubenswrapper[4991]: E1206 01:21:25.340138 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 01:21:25 crc kubenswrapper[4991]: E1206 01:21:25.340180 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift podName:04a746be-9ac3-4e2e-a264-6fc99dc51527 nodeName:}" failed. No retries permitted until 2025-12-06 01:21:25.84016427 +0000 UTC m=+1159.190454738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift") pod "swift-storage-0" (UID: "04a746be-9ac3-4e2e-a264-6fc99dc51527") : configmap "swift-ring-files" not found Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.340297 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/04a746be-9ac3-4e2e-a264-6fc99dc51527-lock\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.369332 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx6wv\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-kube-api-access-wx6wv\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.373177 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.640803 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mtqz6"] Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.650637 4991 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.658453 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.658678 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.658792 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.659450 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mtqz6"] Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.702153 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746520 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-combined-ca-bundle\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746576 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-swiftconf\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746594 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258rj\" (UniqueName: 
\"kubernetes.io/projected/955d4a82-f627-490d-bc6c-cb7420010230-kube-api-access-258rj\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746616 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-dispersionconf\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746680 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-ring-data-devices\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746713 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-scripts\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.746730 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955d4a82-f627-490d-bc6c-cb7420010230-etc-swift\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.829831 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" 
event={"ID":"911fb4db-d797-447e-9aa3-671f50be1815","Type":"ContainerDied","Data":"c9e5d7b74a720a0eeb9c0598c68509962531c7fc14144f90d6d867ae476d7c91"} Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.829889 4991 scope.go:117] "RemoveContainer" containerID="ea659aa8477ce81114559324bc0c8f5db8c1d4bb27ccd15decc1cc02c36ddeed" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.830015 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6dw72" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.832268 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" event={"ID":"257d40ce-9d2a-4cc4-9e89-4cb777582c47","Type":"ContainerStarted","Data":"aa13a8613f1e22def58732e570dd15827bf8ab0abc516669d48f6cd17ee5c575"} Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.835789 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dlhpm" event={"ID":"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31","Type":"ContainerStarted","Data":"8ea453040fcf5983c2c1002cddf839fba18ed4cb2145ee013a4608e74927f6e2"} Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.847604 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xng\" (UniqueName: \"kubernetes.io/projected/911fb4db-d797-447e-9aa3-671f50be1815-kube-api-access-54xng\") pod \"911fb4db-d797-447e-9aa3-671f50be1815\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.847744 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-ovsdbserver-sb\") pod \"911fb4db-d797-447e-9aa3-671f50be1815\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.847780 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-config\") pod \"911fb4db-d797-447e-9aa3-671f50be1815\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.847831 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-dns-svc\") pod \"911fb4db-d797-447e-9aa3-671f50be1815\" (UID: \"911fb4db-d797-447e-9aa3-671f50be1815\") " Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-dispersionconf\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852244 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852277 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-ring-data-devices\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852331 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-scripts\") pod \"swift-ring-rebalance-mtqz6\" (UID: 
\"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852357 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955d4a82-f627-490d-bc6c-cb7420010230-etc-swift\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852402 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-combined-ca-bundle\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852453 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-swiftconf\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.852474 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258rj\" (UniqueName: \"kubernetes.io/projected/955d4a82-f627-490d-bc6c-cb7420010230-kube-api-access-258rj\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: E1206 01:21:25.853651 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 01:21:25 crc kubenswrapper[4991]: E1206 01:21:25.853676 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Dec 06 01:21:25 crc kubenswrapper[4991]: E1206 01:21:25.853715 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift podName:04a746be-9ac3-4e2e-a264-6fc99dc51527 nodeName:}" failed. No retries permitted until 2025-12-06 01:21:26.853702174 +0000 UTC m=+1160.203992642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift") pod "swift-storage-0" (UID: "04a746be-9ac3-4e2e-a264-6fc99dc51527") : configmap "swift-ring-files" not found Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.860356 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955d4a82-f627-490d-bc6c-cb7420010230-etc-swift\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.860901 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-ring-data-devices\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.861357 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-scripts\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.863961 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911fb4db-d797-447e-9aa3-671f50be1815-kube-api-access-54xng" 
(OuterVolumeSpecName: "kube-api-access-54xng") pod "911fb4db-d797-447e-9aa3-671f50be1815" (UID: "911fb4db-d797-447e-9aa3-671f50be1815"). InnerVolumeSpecName "kube-api-access-54xng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.879079 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-combined-ca-bundle\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.881449 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-dispersionconf\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.882062 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-swiftconf\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.888695 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258rj\" (UniqueName: \"kubernetes.io/projected/955d4a82-f627-490d-bc6c-cb7420010230-kube-api-access-258rj\") pod \"swift-ring-rebalance-mtqz6\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.955644 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54xng\" (UniqueName: \"kubernetes.io/projected/911fb4db-d797-447e-9aa3-671f50be1815-kube-api-access-54xng\") 
on node \"crc\" DevicePath \"\"" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.959873 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-config" (OuterVolumeSpecName: "config") pod "911fb4db-d797-447e-9aa3-671f50be1815" (UID: "911fb4db-d797-447e-9aa3-671f50be1815"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.975356 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "911fb4db-d797-447e-9aa3-671f50be1815" (UID: "911fb4db-d797-447e-9aa3-671f50be1815"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:25 crc kubenswrapper[4991]: I1206 01:21:25.983561 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "911fb4db-d797-447e-9aa3-671f50be1815" (UID: "911fb4db-d797-447e-9aa3-671f50be1815"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.017192 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.058208 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.058240 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.058254 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911fb4db-d797-447e-9aa3-671f50be1815-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.169297 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-dlhpm" podStartSLOduration=4.169267026 podStartE2EDuration="4.169267026s" podCreationTimestamp="2025-12-06 01:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:25.871419168 +0000 UTC m=+1159.221709636" watchObservedRunningTime="2025-12-06 01:21:26.169267026 +0000 UTC m=+1159.519557494" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.205125 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6dw72"] Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.224978 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6dw72"] Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.499792 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mtqz6"] Dec 06 01:21:26 crc kubenswrapper[4991]: W1206 01:21:26.506196 4991 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod955d4a82_f627_490d_bc6c_cb7420010230.slice/crio-e3df034ccf93cac8353d9b07a92d23fb4c64d400e9ae9c6a197dd55173705236 WatchSource:0}: Error finding container e3df034ccf93cac8353d9b07a92d23fb4c64d400e9ae9c6a197dd55173705236: Status 404 returned error can't find the container with id e3df034ccf93cac8353d9b07a92d23fb4c64d400e9ae9c6a197dd55173705236 Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.844723 4991 generic.go:334] "Generic (PLEG): container finished" podID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerID="5491c90568a001a1638a6e6e8efe9b049d0706709eacdb70c2bedc4c77bb7654" exitCode=0 Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.844770 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" event={"ID":"257d40ce-9d2a-4cc4-9e89-4cb777582c47","Type":"ContainerDied","Data":"5491c90568a001a1638a6e6e8efe9b049d0706709eacdb70c2bedc4c77bb7654"} Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.847558 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"49a9417e-79cd-4b0c-a5b0-5528a9e20edc","Type":"ContainerStarted","Data":"f9453ffe8ae558fc8cd1dc1e3c22fdb2d988e07ba1153a59b2e56c7419529c7e"} Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.847662 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"49a9417e-79cd-4b0c-a5b0-5528a9e20edc","Type":"ContainerStarted","Data":"f36eb5a028984c1677aad987439d4c40c8be17a5b358330898434107c770c11d"} Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.847906 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.849299 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mtqz6" 
event={"ID":"955d4a82-f627-490d-bc6c-cb7420010230","Type":"ContainerStarted","Data":"e3df034ccf93cac8353d9b07a92d23fb4c64d400e9ae9c6a197dd55173705236"} Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.850756 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.872174 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:26 crc kubenswrapper[4991]: E1206 01:21:26.872429 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 01:21:26 crc kubenswrapper[4991]: E1206 01:21:26.872455 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 01:21:26 crc kubenswrapper[4991]: E1206 01:21:26.872518 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift podName:04a746be-9ac3-4e2e-a264-6fc99dc51527 nodeName:}" failed. No retries permitted until 2025-12-06 01:21:28.872498439 +0000 UTC m=+1162.222788917 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift") pod "swift-storage-0" (UID: "04a746be-9ac3-4e2e-a264-6fc99dc51527") : configmap "swift-ring-files" not found Dec 06 01:21:26 crc kubenswrapper[4991]: I1206 01:21:26.905928 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.841649428 podStartE2EDuration="4.905888333s" podCreationTimestamp="2025-12-06 01:21:22 +0000 UTC" firstStartedPulling="2025-12-06 01:21:23.544934347 +0000 UTC m=+1156.895224815" lastFinishedPulling="2025-12-06 01:21:25.609173252 +0000 UTC m=+1158.959463720" observedRunningTime="2025-12-06 01:21:26.894112204 +0000 UTC m=+1160.244402662" watchObservedRunningTime="2025-12-06 01:21:26.905888333 +0000 UTC m=+1160.256178811" Dec 06 01:21:27 crc kubenswrapper[4991]: I1206 01:21:27.073553 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911fb4db-d797-447e-9aa3-671f50be1815" path="/var/lib/kubelet/pods/911fb4db-d797-447e-9aa3-671f50be1815/volumes" Dec 06 01:21:27 crc kubenswrapper[4991]: I1206 01:21:27.862698 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" event={"ID":"257d40ce-9d2a-4cc4-9e89-4cb777582c47","Type":"ContainerStarted","Data":"4446c7bbe395dc5b4d0b88f1e738a6013f2ff5fd7470461981530e62075d68d1"} Dec 06 01:21:27 crc kubenswrapper[4991]: I1206 01:21:27.863260 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:27 crc kubenswrapper[4991]: I1206 01:21:27.885520 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" podStartSLOduration=4.885500671 podStartE2EDuration="4.885500671s" podCreationTimestamp="2025-12-06 01:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-06 01:21:27.881607799 +0000 UTC m=+1161.231898267" watchObservedRunningTime="2025-12-06 01:21:27.885500671 +0000 UTC m=+1161.235791139" Dec 06 01:21:28 crc kubenswrapper[4991]: E1206 01:21:28.915175 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 01:21:28 crc kubenswrapper[4991]: E1206 01:21:28.915209 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 01:21:28 crc kubenswrapper[4991]: E1206 01:21:28.915276 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift podName:04a746be-9ac3-4e2e-a264-6fc99dc51527 nodeName:}" failed. No retries permitted until 2025-12-06 01:21:32.915256082 +0000 UTC m=+1166.265546550 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift") pod "swift-storage-0" (UID: "04a746be-9ac3-4e2e-a264-6fc99dc51527") : configmap "swift-ring-files" not found Dec 06 01:21:28 crc kubenswrapper[4991]: I1206 01:21:28.914966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:30 crc kubenswrapper[4991]: I1206 01:21:30.005651 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 06 01:21:30 crc kubenswrapper[4991]: I1206 01:21:30.006178 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 06 01:21:30 crc kubenswrapper[4991]: I1206 01:21:30.128831 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/openstack-galera-0" Dec 06 01:21:30 crc kubenswrapper[4991]: I1206 01:21:30.905581 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mtqz6" event={"ID":"955d4a82-f627-490d-bc6c-cb7420010230","Type":"ContainerStarted","Data":"5652ee63e5d24bf081a22e67ab33ae9a89ad0974c8287ae8b41bce1092d6c17a"} Dec 06 01:21:30 crc kubenswrapper[4991]: I1206 01:21:30.933678 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mtqz6" podStartSLOduration=1.9641300080000001 podStartE2EDuration="5.933657187s" podCreationTimestamp="2025-12-06 01:21:25 +0000 UTC" firstStartedPulling="2025-12-06 01:21:26.508507929 +0000 UTC m=+1159.858798397" lastFinishedPulling="2025-12-06 01:21:30.478035098 +0000 UTC m=+1163.828325576" observedRunningTime="2025-12-06 01:21:30.924660952 +0000 UTC m=+1164.274951420" watchObservedRunningTime="2025-12-06 01:21:30.933657187 +0000 UTC m=+1164.283947655" Dec 06 01:21:30 crc kubenswrapper[4991]: I1206 01:21:30.999579 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.330716 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7xb7k"] Dec 06 01:21:31 crc kubenswrapper[4991]: E1206 01:21:31.332252 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911fb4db-d797-447e-9aa3-671f50be1815" containerName="init" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.332322 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="911fb4db-d797-447e-9aa3-671f50be1815" containerName="init" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.332525 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="911fb4db-d797-447e-9aa3-671f50be1815" containerName="init" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.333175 4991 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.339351 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3d5c-account-create-update-vbgdh"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.340682 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.342264 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.343332 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.349875 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.355933 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d5c-account-create-update-vbgdh"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.364460 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7xb7k"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.459387 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.507468 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25db95d2-7c84-4c51-a212-4590d91cedee-operator-scripts\") pod \"keystone-db-create-7xb7k\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.507810 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9jl\" (UniqueName: \"kubernetes.io/projected/d902c58d-de3b-420a-95de-27f9393ef061-kube-api-access-tn9jl\") pod \"keystone-3d5c-account-create-update-vbgdh\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.507927 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d902c58d-de3b-420a-95de-27f9393ef061-operator-scripts\") pod \"keystone-3d5c-account-create-update-vbgdh\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.508087 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ctdw\" (UniqueName: \"kubernetes.io/projected/25db95d2-7c84-4c51-a212-4590d91cedee-kube-api-access-8ctdw\") pod \"keystone-db-create-7xb7k\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.610011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9jl\" (UniqueName: \"kubernetes.io/projected/d902c58d-de3b-420a-95de-27f9393ef061-kube-api-access-tn9jl\") pod \"keystone-3d5c-account-create-update-vbgdh\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.610107 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d902c58d-de3b-420a-95de-27f9393ef061-operator-scripts\") pod \"keystone-3d5c-account-create-update-vbgdh\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " 
pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.610200 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ctdw\" (UniqueName: \"kubernetes.io/projected/25db95d2-7c84-4c51-a212-4590d91cedee-kube-api-access-8ctdw\") pod \"keystone-db-create-7xb7k\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.610238 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25db95d2-7c84-4c51-a212-4590d91cedee-operator-scripts\") pod \"keystone-db-create-7xb7k\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.611225 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25db95d2-7c84-4c51-a212-4590d91cedee-operator-scripts\") pod \"keystone-db-create-7xb7k\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.612228 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d902c58d-de3b-420a-95de-27f9393ef061-operator-scripts\") pod \"keystone-3d5c-account-create-update-vbgdh\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.645963 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ctdw\" (UniqueName: \"kubernetes.io/projected/25db95d2-7c84-4c51-a212-4590d91cedee-kube-api-access-8ctdw\") pod \"keystone-db-create-7xb7k\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " 
pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.647375 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9jl\" (UniqueName: \"kubernetes.io/projected/d902c58d-de3b-420a-95de-27f9393ef061-kube-api-access-tn9jl\") pod \"keystone-3d5c-account-create-update-vbgdh\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.651288 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2snkc"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.652531 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.663514 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2snkc"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.664191 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.672577 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.741897 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97898e71-ae7d-405f-898b-5bb25c2948ff-operator-scripts\") pod \"placement-db-create-2snkc\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.742013 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zj56\" (UniqueName: \"kubernetes.io/projected/97898e71-ae7d-405f-898b-5bb25c2948ff-kube-api-access-9zj56\") pod \"placement-db-create-2snkc\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.762474 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e2eb-account-create-update-pd2t5"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.772211 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.782273 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.786530 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e2eb-account-create-update-pd2t5"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.843971 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zj56\" (UniqueName: \"kubernetes.io/projected/97898e71-ae7d-405f-898b-5bb25c2948ff-kube-api-access-9zj56\") pod \"placement-db-create-2snkc\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.844310 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xwh\" (UniqueName: \"kubernetes.io/projected/31697208-90b9-4041-98b1-9a63a2671b4b-kube-api-access-f8xwh\") pod \"placement-e2eb-account-create-update-pd2t5\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.844413 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97898e71-ae7d-405f-898b-5bb25c2948ff-operator-scripts\") pod \"placement-db-create-2snkc\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.844434 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31697208-90b9-4041-98b1-9a63a2671b4b-operator-scripts\") pod \"placement-e2eb-account-create-update-pd2t5\" (UID: 
\"31697208-90b9-4041-98b1-9a63a2671b4b\") " pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.845302 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97898e71-ae7d-405f-898b-5bb25c2948ff-operator-scripts\") pod \"placement-db-create-2snkc\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.878708 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zj56\" (UniqueName: \"kubernetes.io/projected/97898e71-ae7d-405f-898b-5bb25c2948ff-kube-api-access-9zj56\") pod \"placement-db-create-2snkc\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " pod="openstack/placement-db-create-2snkc" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.948308 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31697208-90b9-4041-98b1-9a63a2671b4b-operator-scripts\") pod \"placement-e2eb-account-create-update-pd2t5\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.948467 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xwh\" (UniqueName: \"kubernetes.io/projected/31697208-90b9-4041-98b1-9a63a2671b4b-kube-api-access-f8xwh\") pod \"placement-e2eb-account-create-update-pd2t5\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.952547 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31697208-90b9-4041-98b1-9a63a2671b4b-operator-scripts\") pod 
\"placement-e2eb-account-create-update-pd2t5\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.970839 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7wx6t"] Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.972332 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.978641 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xwh\" (UniqueName: \"kubernetes.io/projected/31697208-90b9-4041-98b1-9a63a2671b4b-kube-api-access-f8xwh\") pod \"placement-e2eb-account-create-update-pd2t5\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:31 crc kubenswrapper[4991]: I1206 01:21:31.981671 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7wx6t"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.049728 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.082131 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6152-account-create-update-zzm8n"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.102919 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6152-account-create-update-zzm8n"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.103069 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.109632 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.141152 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2snkc" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.153018 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.153585 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc2f876-daef-47fe-81a6-1eced742e7f6-operator-scripts\") pod \"glance-db-create-7wx6t\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.153635 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q7pz\" (UniqueName: \"kubernetes.io/projected/adc2f876-daef-47fe-81a6-1eced742e7f6-kube-api-access-4q7pz\") pod \"glance-db-create-7wx6t\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.258744 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc2f876-daef-47fe-81a6-1eced742e7f6-operator-scripts\") pod \"glance-db-create-7wx6t\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.260475 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/adc2f876-daef-47fe-81a6-1eced742e7f6-operator-scripts\") pod \"glance-db-create-7wx6t\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.264125 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q7pz\" (UniqueName: \"kubernetes.io/projected/adc2f876-daef-47fe-81a6-1eced742e7f6-kube-api-access-4q7pz\") pod \"glance-db-create-7wx6t\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.264435 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjlrf\" (UniqueName: \"kubernetes.io/projected/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-kube-api-access-bjlrf\") pod \"glance-6152-account-create-update-zzm8n\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.264666 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-operator-scripts\") pod \"glance-6152-account-create-update-zzm8n\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.276195 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7xb7k"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.289073 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d5c-account-create-update-vbgdh"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.295951 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q7pz\" (UniqueName: 
\"kubernetes.io/projected/adc2f876-daef-47fe-81a6-1eced742e7f6-kube-api-access-4q7pz\") pod \"glance-db-create-7wx6t\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.308279 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.367132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjlrf\" (UniqueName: \"kubernetes.io/projected/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-kube-api-access-bjlrf\") pod \"glance-6152-account-create-update-zzm8n\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.367303 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-operator-scripts\") pod \"glance-6152-account-create-update-zzm8n\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.368160 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-operator-scripts\") pod \"glance-6152-account-create-update-zzm8n\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.389492 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjlrf\" (UniqueName: \"kubernetes.io/projected/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-kube-api-access-bjlrf\") pod \"glance-6152-account-create-update-zzm8n\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " 
pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.479067 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.542447 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2snkc"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.684155 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e2eb-account-create-update-pd2t5"] Dec 06 01:21:32 crc kubenswrapper[4991]: W1206 01:21:32.699865 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31697208_90b9_4041_98b1_9a63a2671b4b.slice/crio-eb64de60f3ab6f19cd8d360144a6e344b754358ff4547b111cd6a86cd8d3f981 WatchSource:0}: Error finding container eb64de60f3ab6f19cd8d360144a6e344b754358ff4547b111cd6a86cd8d3f981: Status 404 returned error can't find the container with id eb64de60f3ab6f19cd8d360144a6e344b754358ff4547b111cd6a86cd8d3f981 Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.725791 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7wx6t"] Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.898384 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.927651 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e2eb-account-create-update-pd2t5" event={"ID":"31697208-90b9-4041-98b1-9a63a2671b4b","Type":"ContainerStarted","Data":"eb64de60f3ab6f19cd8d360144a6e344b754358ff4547b111cd6a86cd8d3f981"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.942261 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2snkc" 
event={"ID":"97898e71-ae7d-405f-898b-5bb25c2948ff","Type":"ContainerStarted","Data":"ffa5ede272eb378c2ea949374bd2266d9389a836e3e8cabd4b508d9d23ca43bf"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.943658 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7wx6t" event={"ID":"adc2f876-daef-47fe-81a6-1eced742e7f6","Type":"ContainerStarted","Data":"c6964426a6a6dad9f2f6142fba8b9d9ee256c4ffac2479fdb6aef9ef28b4be77"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.944709 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7xb7k" event={"ID":"25db95d2-7c84-4c51-a212-4590d91cedee","Type":"ContainerStarted","Data":"72a754ea75011988bbe8de2bc3b63835462a8420758ab77ea1bdfeeb4d1c7b28"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.944745 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7xb7k" event={"ID":"25db95d2-7c84-4c51-a212-4590d91cedee","Type":"ContainerStarted","Data":"a4badc9d36ba6ab35fc7c9dec9e4878761af85b485166197af403a8c7dce4670"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.958484 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d5c-account-create-update-vbgdh" event={"ID":"d902c58d-de3b-420a-95de-27f9393ef061","Type":"ContainerStarted","Data":"fa3032c957230ec11f138a64bea4c7d7fff691314baba4427ffab297fbf812d7"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.958527 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d5c-account-create-update-vbgdh" event={"ID":"d902c58d-de3b-420a-95de-27f9393ef061","Type":"ContainerStarted","Data":"21a50fb33e61c82c469c23e649b26321e8a5975e544efd60991e7bd8be8546db"} Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.975630 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-7xb7k" podStartSLOduration=1.975614449 podStartE2EDuration="1.975614449s" 
podCreationTimestamp="2025-12-06 01:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:32.971136932 +0000 UTC m=+1166.321427400" watchObservedRunningTime="2025-12-06 01:21:32.975614449 +0000 UTC m=+1166.325904917" Dec 06 01:21:32 crc kubenswrapper[4991]: I1206 01:21:32.982852 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:32 crc kubenswrapper[4991]: E1206 01:21:32.986675 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 06 01:21:32 crc kubenswrapper[4991]: E1206 01:21:32.987158 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 06 01:21:32 crc kubenswrapper[4991]: E1206 01:21:32.987222 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift podName:04a746be-9ac3-4e2e-a264-6fc99dc51527 nodeName:}" failed. No retries permitted until 2025-12-06 01:21:40.987202512 +0000 UTC m=+1174.337492980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift") pod "swift-storage-0" (UID: "04a746be-9ac3-4e2e-a264-6fc99dc51527") : configmap "swift-ring-files" not found Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.076006 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-3d5c-account-create-update-vbgdh" podStartSLOduration=2.075963496 podStartE2EDuration="2.075963496s" podCreationTimestamp="2025-12-06 01:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:33.001276411 +0000 UTC m=+1166.351566879" watchObservedRunningTime="2025-12-06 01:21:33.075963496 +0000 UTC m=+1166.426253964" Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.079482 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6152-account-create-update-zzm8n"] Dec 06 01:21:33 crc kubenswrapper[4991]: W1206 01:21:33.083422 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78a85bfe_6df1_44b8_ad38_9a8eba3e662b.slice/crio-42e8eb37fd9bb65e55fdd7d61f42771139e3b08b99ada5fc721c1ec10798d73e WatchSource:0}: Error finding container 42e8eb37fd9bb65e55fdd7d61f42771139e3b08b99ada5fc721c1ec10798d73e: Status 404 returned error can't find the container with id 42e8eb37fd9bb65e55fdd7d61f42771139e3b08b99ada5fc721c1ec10798d73e Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.968170 4991 generic.go:334] "Generic (PLEG): container finished" podID="25db95d2-7c84-4c51-a212-4590d91cedee" containerID="72a754ea75011988bbe8de2bc3b63835462a8420758ab77ea1bdfeeb4d1c7b28" exitCode=0 Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.968233 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7xb7k" 
event={"ID":"25db95d2-7c84-4c51-a212-4590d91cedee","Type":"ContainerDied","Data":"72a754ea75011988bbe8de2bc3b63835462a8420758ab77ea1bdfeeb4d1c7b28"} Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.971470 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e2eb-account-create-update-pd2t5" event={"ID":"31697208-90b9-4041-98b1-9a63a2671b4b","Type":"ContainerStarted","Data":"dedf0cac13abd1211ab22dee8d33e362bd80ff30c0b6bf78ffb8c62b24deae22"} Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.980959 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2snkc" event={"ID":"97898e71-ae7d-405f-898b-5bb25c2948ff","Type":"ContainerStarted","Data":"54bec077365df657bfb739dde45467670bdf44df4cdbeff9adf99517b58523d8"} Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.989200 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6152-account-create-update-zzm8n" event={"ID":"78a85bfe-6df1-44b8-ad38-9a8eba3e662b","Type":"ContainerStarted","Data":"5e0cdce1ad6e5bceb0ff6bb0b94bdc291597ef588166dda553c99cd718396e04"} Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.989255 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6152-account-create-update-zzm8n" event={"ID":"78a85bfe-6df1-44b8-ad38-9a8eba3e662b","Type":"ContainerStarted","Data":"42e8eb37fd9bb65e55fdd7d61f42771139e3b08b99ada5fc721c1ec10798d73e"} Dec 06 01:21:33 crc kubenswrapper[4991]: I1206 01:21:33.993456 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7wx6t" event={"ID":"adc2f876-daef-47fe-81a6-1eced742e7f6","Type":"ContainerStarted","Data":"3bc552300db9f5778af42c4fc8bb6bff650f5dbeb539656c8aa06c33751d6be3"} Dec 06 01:21:34 crc kubenswrapper[4991]: I1206 01:21:34.012465 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-e2eb-account-create-update-pd2t5" podStartSLOduration=3.012441735 
podStartE2EDuration="3.012441735s" podCreationTimestamp="2025-12-06 01:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:34.011289005 +0000 UTC m=+1167.361579473" watchObservedRunningTime="2025-12-06 01:21:34.012441735 +0000 UTC m=+1167.362732193" Dec 06 01:21:34 crc kubenswrapper[4991]: I1206 01:21:34.037552 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-2snkc" podStartSLOduration=3.037521572 podStartE2EDuration="3.037521572s" podCreationTimestamp="2025-12-06 01:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:34.030326764 +0000 UTC m=+1167.380617232" watchObservedRunningTime="2025-12-06 01:21:34.037521572 +0000 UTC m=+1167.387812040" Dec 06 01:21:34 crc kubenswrapper[4991]: I1206 01:21:34.047542 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-7wx6t" podStartSLOduration=3.047521054 podStartE2EDuration="3.047521054s" podCreationTimestamp="2025-12-06 01:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:21:34.046394874 +0000 UTC m=+1167.396685342" watchObservedRunningTime="2025-12-06 01:21:34.047521054 +0000 UTC m=+1167.397811522" Dec 06 01:21:34 crc kubenswrapper[4991]: E1206 01:21:34.400949 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadc2f876_daef_47fe_81a6_1eced742e7f6.slice/crio-3bc552300db9f5778af42c4fc8bb6bff650f5dbeb539656c8aa06c33751d6be3.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd902c58d_de3b_420a_95de_27f9393ef061.slice/crio-fa3032c957230ec11f138a64bea4c7d7fff691314baba4427ffab297fbf812d7.scope\": RecentStats: unable to find data in memory cache]" Dec 06 01:21:34 crc kubenswrapper[4991]: I1206 01:21:34.480399 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:21:34 crc kubenswrapper[4991]: I1206 01:21:34.544265 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dlhpm"] Dec 06 01:21:34 crc kubenswrapper[4991]: I1206 01:21:34.544604 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-dlhpm" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerName="dnsmasq-dns" containerID="cri-o://8ea453040fcf5983c2c1002cddf839fba18ed4cb2145ee013a4608e74927f6e2" gracePeriod=10 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.022136 4991 generic.go:334] "Generic (PLEG): container finished" podID="78a85bfe-6df1-44b8-ad38-9a8eba3e662b" containerID="5e0cdce1ad6e5bceb0ff6bb0b94bdc291597ef588166dda553c99cd718396e04" exitCode=0 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.022238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6152-account-create-update-zzm8n" event={"ID":"78a85bfe-6df1-44b8-ad38-9a8eba3e662b","Type":"ContainerDied","Data":"5e0cdce1ad6e5bceb0ff6bb0b94bdc291597ef588166dda553c99cd718396e04"} Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.025640 4991 generic.go:334] "Generic (PLEG): container finished" podID="adc2f876-daef-47fe-81a6-1eced742e7f6" containerID="3bc552300db9f5778af42c4fc8bb6bff650f5dbeb539656c8aa06c33751d6be3" exitCode=0 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.025713 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7wx6t" 
event={"ID":"adc2f876-daef-47fe-81a6-1eced742e7f6","Type":"ContainerDied","Data":"3bc552300db9f5778af42c4fc8bb6bff650f5dbeb539656c8aa06c33751d6be3"} Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.031428 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerID="8ea453040fcf5983c2c1002cddf839fba18ed4cb2145ee013a4608e74927f6e2" exitCode=0 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.031490 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dlhpm" event={"ID":"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31","Type":"ContainerDied","Data":"8ea453040fcf5983c2c1002cddf839fba18ed4cb2145ee013a4608e74927f6e2"} Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.033076 4991 generic.go:334] "Generic (PLEG): container finished" podID="d902c58d-de3b-420a-95de-27f9393ef061" containerID="fa3032c957230ec11f138a64bea4c7d7fff691314baba4427ffab297fbf812d7" exitCode=0 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.033119 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d5c-account-create-update-vbgdh" event={"ID":"d902c58d-de3b-420a-95de-27f9393ef061","Type":"ContainerDied","Data":"fa3032c957230ec11f138a64bea4c7d7fff691314baba4427ffab297fbf812d7"} Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.037641 4991 generic.go:334] "Generic (PLEG): container finished" podID="31697208-90b9-4041-98b1-9a63a2671b4b" containerID="dedf0cac13abd1211ab22dee8d33e362bd80ff30c0b6bf78ffb8c62b24deae22" exitCode=0 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.040719 4991 generic.go:334] "Generic (PLEG): container finished" podID="97898e71-ae7d-405f-898b-5bb25c2948ff" containerID="54bec077365df657bfb739dde45467670bdf44df4cdbeff9adf99517b58523d8" exitCode=0 Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.067066 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e2eb-account-create-update-pd2t5" 
event={"ID":"31697208-90b9-4041-98b1-9a63a2671b4b","Type":"ContainerDied","Data":"dedf0cac13abd1211ab22dee8d33e362bd80ff30c0b6bf78ffb8c62b24deae22"} Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.067124 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2snkc" event={"ID":"97898e71-ae7d-405f-898b-5bb25c2948ff","Type":"ContainerDied","Data":"54bec077365df657bfb739dde45467670bdf44df4cdbeff9adf99517b58523d8"} Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.223397 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.333525 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p99t6\" (UniqueName: \"kubernetes.io/projected/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-kube-api-access-p99t6\") pod \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.333608 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-config\") pod \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.333663 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-sb\") pod \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.333756 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-nb\") pod 
\"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.333785 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-dns-svc\") pod \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\" (UID: \"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.345791 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-kube-api-access-p99t6" (OuterVolumeSpecName: "kube-api-access-p99t6") pod "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" (UID: "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31"). InnerVolumeSpecName "kube-api-access-p99t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.347543 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p99t6\" (UniqueName: \"kubernetes.io/projected/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-kube-api-access-p99t6\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.391577 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" (UID: "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.397175 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-config" (OuterVolumeSpecName: "config") pod "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" (UID: "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.399562 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" (UID: "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.403609 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" (UID: "7e97b3a5-f53d-4aee-9d21-3ca0d6928b31"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.448877 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.448914 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.448925 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.448935 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:35 crc 
kubenswrapper[4991]: I1206 01:21:35.460285 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.550696 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25db95d2-7c84-4c51-a212-4590d91cedee-operator-scripts\") pod \"25db95d2-7c84-4c51-a212-4590d91cedee\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.550779 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ctdw\" (UniqueName: \"kubernetes.io/projected/25db95d2-7c84-4c51-a212-4590d91cedee-kube-api-access-8ctdw\") pod \"25db95d2-7c84-4c51-a212-4590d91cedee\" (UID: \"25db95d2-7c84-4c51-a212-4590d91cedee\") " Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.552071 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25db95d2-7c84-4c51-a212-4590d91cedee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25db95d2-7c84-4c51-a212-4590d91cedee" (UID: "25db95d2-7c84-4c51-a212-4590d91cedee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.554480 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25db95d2-7c84-4c51-a212-4590d91cedee-kube-api-access-8ctdw" (OuterVolumeSpecName: "kube-api-access-8ctdw") pod "25db95d2-7c84-4c51-a212-4590d91cedee" (UID: "25db95d2-7c84-4c51-a212-4590d91cedee"). InnerVolumeSpecName "kube-api-access-8ctdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.653427 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25db95d2-7c84-4c51-a212-4590d91cedee-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:35 crc kubenswrapper[4991]: I1206 01:21:35.653477 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ctdw\" (UniqueName: \"kubernetes.io/projected/25db95d2-7c84-4c51-a212-4590d91cedee-kube-api-access-8ctdw\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.057249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dlhpm" event={"ID":"7e97b3a5-f53d-4aee-9d21-3ca0d6928b31","Type":"ContainerDied","Data":"76f739d933a78b9bddb7a5add40d869c3f218ef6b99650ae6889a09c5500ad3f"} Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.057364 4991 scope.go:117] "RemoveContainer" containerID="8ea453040fcf5983c2c1002cddf839fba18ed4cb2145ee013a4608e74927f6e2" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.057284 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dlhpm" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.063971 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7xb7k" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.067222 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7xb7k" event={"ID":"25db95d2-7c84-4c51-a212-4590d91cedee","Type":"ContainerDied","Data":"a4badc9d36ba6ab35fc7c9dec9e4878761af85b485166197af403a8c7dce4670"} Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.074123 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4badc9d36ba6ab35fc7c9dec9e4878761af85b485166197af403a8c7dce4670" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.107361 4991 scope.go:117] "RemoveContainer" containerID="d95b1aadd43c5fa26fde516673f803479461a565fcc192a91401bb15ce699b25" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.126680 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dlhpm"] Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.144958 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dlhpm"] Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.525740 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.624449 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2snkc" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.678744 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31697208-90b9-4041-98b1-9a63a2671b4b-operator-scripts\") pod \"31697208-90b9-4041-98b1-9a63a2671b4b\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.679097 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8xwh\" (UniqueName: \"kubernetes.io/projected/31697208-90b9-4041-98b1-9a63a2671b4b-kube-api-access-f8xwh\") pod \"31697208-90b9-4041-98b1-9a63a2671b4b\" (UID: \"31697208-90b9-4041-98b1-9a63a2671b4b\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.679832 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31697208-90b9-4041-98b1-9a63a2671b4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31697208-90b9-4041-98b1-9a63a2671b4b" (UID: "31697208-90b9-4041-98b1-9a63a2671b4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.683842 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31697208-90b9-4041-98b1-9a63a2671b4b-kube-api-access-f8xwh" (OuterVolumeSpecName: "kube-api-access-f8xwh") pod "31697208-90b9-4041-98b1-9a63a2671b4b" (UID: "31697208-90b9-4041-98b1-9a63a2671b4b"). InnerVolumeSpecName "kube-api-access-f8xwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.736611 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.747573 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.766118 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.783769 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97898e71-ae7d-405f-898b-5bb25c2948ff-operator-scripts\") pod \"97898e71-ae7d-405f-898b-5bb25c2948ff\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.784207 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zj56\" (UniqueName: \"kubernetes.io/projected/97898e71-ae7d-405f-898b-5bb25c2948ff-kube-api-access-9zj56\") pod \"97898e71-ae7d-405f-898b-5bb25c2948ff\" (UID: \"97898e71-ae7d-405f-898b-5bb25c2948ff\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.784667 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8xwh\" (UniqueName: \"kubernetes.io/projected/31697208-90b9-4041-98b1-9a63a2671b4b-kube-api-access-f8xwh\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.784688 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31697208-90b9-4041-98b1-9a63a2671b4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.785447 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97898e71-ae7d-405f-898b-5bb25c2948ff-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "97898e71-ae7d-405f-898b-5bb25c2948ff" (UID: "97898e71-ae7d-405f-898b-5bb25c2948ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.797443 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97898e71-ae7d-405f-898b-5bb25c2948ff-kube-api-access-9zj56" (OuterVolumeSpecName: "kube-api-access-9zj56") pod "97898e71-ae7d-405f-898b-5bb25c2948ff" (UID: "97898e71-ae7d-405f-898b-5bb25c2948ff"). InnerVolumeSpecName "kube-api-access-9zj56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.885815 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-operator-scripts\") pod \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.885976 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d902c58d-de3b-420a-95de-27f9393ef061-operator-scripts\") pod \"d902c58d-de3b-420a-95de-27f9393ef061\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886030 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q7pz\" (UniqueName: \"kubernetes.io/projected/adc2f876-daef-47fe-81a6-1eced742e7f6-kube-api-access-4q7pz\") pod \"adc2f876-daef-47fe-81a6-1eced742e7f6\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886090 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjlrf\" (UniqueName: 
\"kubernetes.io/projected/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-kube-api-access-bjlrf\") pod \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\" (UID: \"78a85bfe-6df1-44b8-ad38-9a8eba3e662b\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886159 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc2f876-daef-47fe-81a6-1eced742e7f6-operator-scripts\") pod \"adc2f876-daef-47fe-81a6-1eced742e7f6\" (UID: \"adc2f876-daef-47fe-81a6-1eced742e7f6\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886198 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn9jl\" (UniqueName: \"kubernetes.io/projected/d902c58d-de3b-420a-95de-27f9393ef061-kube-api-access-tn9jl\") pod \"d902c58d-de3b-420a-95de-27f9393ef061\" (UID: \"d902c58d-de3b-420a-95de-27f9393ef061\") " Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886676 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zj56\" (UniqueName: \"kubernetes.io/projected/97898e71-ae7d-405f-898b-5bb25c2948ff-kube-api-access-9zj56\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886713 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97898e71-ae7d-405f-898b-5bb25c2948ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.886936 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adc2f876-daef-47fe-81a6-1eced742e7f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adc2f876-daef-47fe-81a6-1eced742e7f6" (UID: "adc2f876-daef-47fe-81a6-1eced742e7f6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.887043 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d902c58d-de3b-420a-95de-27f9393ef061-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d902c58d-de3b-420a-95de-27f9393ef061" (UID: "d902c58d-de3b-420a-95de-27f9393ef061"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.887213 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78a85bfe-6df1-44b8-ad38-9a8eba3e662b" (UID: "78a85bfe-6df1-44b8-ad38-9a8eba3e662b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.889311 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adc2f876-daef-47fe-81a6-1eced742e7f6-kube-api-access-4q7pz" (OuterVolumeSpecName: "kube-api-access-4q7pz") pod "adc2f876-daef-47fe-81a6-1eced742e7f6" (UID: "adc2f876-daef-47fe-81a6-1eced742e7f6"). InnerVolumeSpecName "kube-api-access-4q7pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.889845 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-kube-api-access-bjlrf" (OuterVolumeSpecName: "kube-api-access-bjlrf") pod "78a85bfe-6df1-44b8-ad38-9a8eba3e662b" (UID: "78a85bfe-6df1-44b8-ad38-9a8eba3e662b"). InnerVolumeSpecName "kube-api-access-bjlrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.899496 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d902c58d-de3b-420a-95de-27f9393ef061-kube-api-access-tn9jl" (OuterVolumeSpecName: "kube-api-access-tn9jl") pod "d902c58d-de3b-420a-95de-27f9393ef061" (UID: "d902c58d-de3b-420a-95de-27f9393ef061"). InnerVolumeSpecName "kube-api-access-tn9jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.989134 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d902c58d-de3b-420a-95de-27f9393ef061-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.989215 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q7pz\" (UniqueName: \"kubernetes.io/projected/adc2f876-daef-47fe-81a6-1eced742e7f6-kube-api-access-4q7pz\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.989232 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjlrf\" (UniqueName: \"kubernetes.io/projected/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-kube-api-access-bjlrf\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.989244 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adc2f876-daef-47fe-81a6-1eced742e7f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.989257 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn9jl\" (UniqueName: \"kubernetes.io/projected/d902c58d-de3b-420a-95de-27f9393ef061-kube-api-access-tn9jl\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:36 crc kubenswrapper[4991]: I1206 01:21:36.989266 4991 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78a85bfe-6df1-44b8-ad38-9a8eba3e662b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.051146 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" path="/var/lib/kubelet/pods/7e97b3a5-f53d-4aee-9d21-3ca0d6928b31/volumes" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.088573 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e2eb-account-create-update-pd2t5" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.089705 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e2eb-account-create-update-pd2t5" event={"ID":"31697208-90b9-4041-98b1-9a63a2671b4b","Type":"ContainerDied","Data":"eb64de60f3ab6f19cd8d360144a6e344b754358ff4547b111cd6a86cd8d3f981"} Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.092167 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb64de60f3ab6f19cd8d360144a6e344b754358ff4547b111cd6a86cd8d3f981" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.093737 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2snkc" event={"ID":"97898e71-ae7d-405f-898b-5bb25c2948ff","Type":"ContainerDied","Data":"ffa5ede272eb378c2ea949374bd2266d9389a836e3e8cabd4b508d9d23ca43bf"} Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.093772 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa5ede272eb378c2ea949374bd2266d9389a836e3e8cabd4b508d9d23ca43bf" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.093836 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2snkc" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.098499 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6152-account-create-update-zzm8n" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.098491 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6152-account-create-update-zzm8n" event={"ID":"78a85bfe-6df1-44b8-ad38-9a8eba3e662b","Type":"ContainerDied","Data":"42e8eb37fd9bb65e55fdd7d61f42771139e3b08b99ada5fc721c1ec10798d73e"} Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.098617 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e8eb37fd9bb65e55fdd7d61f42771139e3b08b99ada5fc721c1ec10798d73e" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.100845 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7wx6t" event={"ID":"adc2f876-daef-47fe-81a6-1eced742e7f6","Type":"ContainerDied","Data":"c6964426a6a6dad9f2f6142fba8b9d9ee256c4ffac2479fdb6aef9ef28b4be77"} Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.100878 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6964426a6a6dad9f2f6142fba8b9d9ee256c4ffac2479fdb6aef9ef28b4be77" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.100923 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7wx6t" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.104318 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d5c-account-create-update-vbgdh" event={"ID":"d902c58d-de3b-420a-95de-27f9393ef061","Type":"ContainerDied","Data":"21a50fb33e61c82c469c23e649b26321e8a5975e544efd60991e7bd8be8546db"} Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.104353 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a50fb33e61c82c469c23e649b26321e8a5975e544efd60991e7bd8be8546db" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.104379 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d5c-account-create-update-vbgdh" Dec 06 01:21:37 crc kubenswrapper[4991]: I1206 01:21:37.982436 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 06 01:21:40 crc kubenswrapper[4991]: I1206 01:21:40.134871 4991 generic.go:334] "Generic (PLEG): container finished" podID="955d4a82-f627-490d-bc6c-cb7420010230" containerID="5652ee63e5d24bf081a22e67ab33ae9a89ad0974c8287ae8b41bce1092d6c17a" exitCode=0 Dec 06 01:21:40 crc kubenswrapper[4991]: I1206 01:21:40.135033 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mtqz6" event={"ID":"955d4a82-f627-490d-bc6c-cb7420010230","Type":"ContainerDied","Data":"5652ee63e5d24bf081a22e67ab33ae9a89ad0974c8287ae8b41bce1092d6c17a"} Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.064887 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.073895 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04a746be-9ac3-4e2e-a264-6fc99dc51527-etc-swift\") pod \"swift-storage-0\" (UID: \"04a746be-9ac3-4e2e-a264-6fc99dc51527\") " pod="openstack/swift-storage-0" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.345652 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.565791 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.679715 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-scripts\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.680225 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-combined-ca-bundle\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.680257 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-swiftconf\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.680290 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-dispersionconf\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " 
Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.680312 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-258rj\" (UniqueName: \"kubernetes.io/projected/955d4a82-f627-490d-bc6c-cb7420010230-kube-api-access-258rj\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.680419 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-ring-data-devices\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.680448 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955d4a82-f627-490d-bc6c-cb7420010230-etc-swift\") pod \"955d4a82-f627-490d-bc6c-cb7420010230\" (UID: \"955d4a82-f627-490d-bc6c-cb7420010230\") " Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.688557 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955d4a82-f627-490d-bc6c-cb7420010230-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.688621 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.701114 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955d4a82-f627-490d-bc6c-cb7420010230-kube-api-access-258rj" (OuterVolumeSpecName: "kube-api-access-258rj") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "kube-api-access-258rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.709095 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-scripts" (OuterVolumeSpecName: "scripts") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.731347 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.738125 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.769443 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "955d4a82-f627-490d-bc6c-cb7420010230" (UID: "955d4a82-f627-490d-bc6c-cb7420010230"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784180 4991 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784233 4991 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955d4a82-f627-490d-bc6c-cb7420010230-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784246 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955d4a82-f627-490d-bc6c-cb7420010230-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784258 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784269 4991 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784280 4991 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/955d4a82-f627-490d-bc6c-cb7420010230-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.784298 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-258rj\" (UniqueName: \"kubernetes.io/projected/955d4a82-f627-490d-bc6c-cb7420010230-kube-api-access-258rj\") on node \"crc\" DevicePath \"\"" Dec 06 01:21:41 crc kubenswrapper[4991]: I1206 01:21:41.927174 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.166846 4991 generic.go:334] "Generic (PLEG): container finished" podID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerID="906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a" exitCode=0 Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.167015 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8380dcf2-30be-4578-937b-5e2a8f4fc074","Type":"ContainerDied","Data":"906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a"} Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.172380 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"87be8b78af80d3cf17cbbb2d026e5a57ad2431c3e88f847e8272bfc5c2ddf163"} Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.177292 4991 generic.go:334] "Generic (PLEG): container finished" podID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerID="ba88aa1a0ccfd2b59d2138748ee66af24deb48ba69955de9f56aa083e4ca9001" exitCode=0 Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.177402 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af","Type":"ContainerDied","Data":"ba88aa1a0ccfd2b59d2138748ee66af24deb48ba69955de9f56aa083e4ca9001"} Dec 06 01:21:42 crc 
kubenswrapper[4991]: I1206 01:21:42.190764 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mtqz6" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.191502 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mtqz6" event={"ID":"955d4a82-f627-490d-bc6c-cb7420010230","Type":"ContainerDied","Data":"e3df034ccf93cac8353d9b07a92d23fb4c64d400e9ae9c6a197dd55173705236"} Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.191558 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3df034ccf93cac8353d9b07a92d23fb4c64d400e9ae9c6a197dd55173705236" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.322481 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rssqk"] Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323029 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25db95d2-7c84-4c51-a212-4590d91cedee" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323053 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="25db95d2-7c84-4c51-a212-4590d91cedee" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323076 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerName="init" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323086 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerName="init" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323102 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d902c58d-de3b-420a-95de-27f9393ef061" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323112 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d902c58d-de3b-420a-95de-27f9393ef061" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323125 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a85bfe-6df1-44b8-ad38-9a8eba3e662b" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323136 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a85bfe-6df1-44b8-ad38-9a8eba3e662b" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323150 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerName="dnsmasq-dns" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323158 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerName="dnsmasq-dns" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323181 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955d4a82-f627-490d-bc6c-cb7420010230" containerName="swift-ring-rebalance" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323191 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="955d4a82-f627-490d-bc6c-cb7420010230" containerName="swift-ring-rebalance" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323206 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc2f876-daef-47fe-81a6-1eced742e7f6" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323215 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc2f876-daef-47fe-81a6-1eced742e7f6" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323233 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31697208-90b9-4041-98b1-9a63a2671b4b" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 
01:21:42.323242 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="31697208-90b9-4041-98b1-9a63a2671b4b" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: E1206 01:21:42.323264 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97898e71-ae7d-405f-898b-5bb25c2948ff" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323272 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="97898e71-ae7d-405f-898b-5bb25c2948ff" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323473 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e97b3a5-f53d-4aee-9d21-3ca0d6928b31" containerName="dnsmasq-dns" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323492 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a85bfe-6df1-44b8-ad38-9a8eba3e662b" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323506 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="25db95d2-7c84-4c51-a212-4590d91cedee" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323544 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="adc2f876-daef-47fe-81a6-1eced742e7f6" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323563 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="31697208-90b9-4041-98b1-9a63a2671b4b" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323577 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="97898e71-ae7d-405f-898b-5bb25c2948ff" containerName="mariadb-database-create" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323592 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="955d4a82-f627-490d-bc6c-cb7420010230" containerName="swift-ring-rebalance" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.323605 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d902c58d-de3b-420a-95de-27f9393ef061" containerName="mariadb-account-create-update" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.324349 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.326706 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.326728 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-88l6p" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.352805 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rssqk"] Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.402538 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-combined-ca-bundle\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.406148 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7crl2\" (UniqueName: \"kubernetes.io/projected/1befea2d-2e39-4901-98ef-da500dcb82f9-kube-api-access-7crl2\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.406305 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-config-data\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.406408 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-db-sync-config-data\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.508122 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7crl2\" (UniqueName: \"kubernetes.io/projected/1befea2d-2e39-4901-98ef-da500dcb82f9-kube-api-access-7crl2\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.508175 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-config-data\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.508211 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-db-sync-config-data\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.508251 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-combined-ca-bundle\") pod 
\"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.514596 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-config-data\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.515433 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-db-sync-config-data\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.516485 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-combined-ca-bundle\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.533895 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7crl2\" (UniqueName: \"kubernetes.io/projected/1befea2d-2e39-4901-98ef-da500dcb82f9-kube-api-access-7crl2\") pod \"glance-db-sync-rssqk\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:42 crc kubenswrapper[4991]: I1206 01:21:42.704855 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rssqk" Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.206862 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8380dcf2-30be-4578-937b-5e2a8f4fc074","Type":"ContainerStarted","Data":"485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54"} Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.207717 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.209812 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af","Type":"ContainerStarted","Data":"b9efb04596d3956458c1741004cf0c3c747574ec679f10d4a938952ad80a97e6"} Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.210670 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.258380 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.451993467 podStartE2EDuration="57.258357681s" podCreationTimestamp="2025-12-06 01:20:46 +0000 UTC" firstStartedPulling="2025-12-06 01:20:48.938762597 +0000 UTC m=+1122.289053065" lastFinishedPulling="2025-12-06 01:21:07.745126811 +0000 UTC m=+1141.095417279" observedRunningTime="2025-12-06 01:21:43.241623663 +0000 UTC m=+1176.591914151" watchObservedRunningTime="2025-12-06 01:21:43.258357681 +0000 UTC m=+1176.608648169" Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.271737 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.726287966 podStartE2EDuration="57.27169481s" podCreationTimestamp="2025-12-06 01:20:46 +0000 UTC" firstStartedPulling="2025-12-06 01:20:54.187698103 +0000 UTC 
m=+1127.537988571" lastFinishedPulling="2025-12-06 01:21:07.733104947 +0000 UTC m=+1141.083395415" observedRunningTime="2025-12-06 01:21:43.265164389 +0000 UTC m=+1176.615454877" watchObservedRunningTime="2025-12-06 01:21:43.27169481 +0000 UTC m=+1176.621985278" Dec 06 01:21:43 crc kubenswrapper[4991]: I1206 01:21:43.305530 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rssqk"] Dec 06 01:21:44 crc kubenswrapper[4991]: I1206 01:21:44.221932 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"6cd852776408dae5a71e2c0c9f1d906ca9e182b69486fe3810aeac3c56254035"} Dec 06 01:21:44 crc kubenswrapper[4991]: I1206 01:21:44.222470 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"8e589b7a52a75801b41e7958b111ba8e354ff0ea5527c0a11e56281c8186320e"} Dec 06 01:21:44 crc kubenswrapper[4991]: I1206 01:21:44.222484 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"59224c3c5a3f131f35a6059204241af79dab77c2bc7d33ad00f8e7f68c0d7b13"} Dec 06 01:21:44 crc kubenswrapper[4991]: I1206 01:21:44.222492 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"0d13a23d4a24a40b83b84e9c61fd68aff67c6501cc26fd5b3c57b1bbfe6d773e"} Dec 06 01:21:44 crc kubenswrapper[4991]: I1206 01:21:44.223649 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rssqk" event={"ID":"1befea2d-2e39-4901-98ef-da500dcb82f9","Type":"ContainerStarted","Data":"28d10f495de102a4cf1980ed911194090f411c05e1ab318930f4a5f09f7989ac"} Dec 06 01:21:47 crc kubenswrapper[4991]: I1206 01:21:47.875008 
4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-82ddm" podUID="a3995a5a-dfc3-4769-8161-31f7e2234d13" containerName="ovn-controller" probeResult="failure" output=< Dec 06 01:21:47 crc kubenswrapper[4991]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 01:21:47 crc kubenswrapper[4991]: > Dec 06 01:21:47 crc kubenswrapper[4991]: I1206 01:21:47.881759 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:21:49 crc kubenswrapper[4991]: I1206 01:21:49.276875 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"cd4631ea3073dc12fe911b57aee7dcef780cd37195789c41ffd4edc658996ad6"} Dec 06 01:21:49 crc kubenswrapper[4991]: I1206 01:21:49.277431 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"3c5fec6f1d10994caa4a31389b129833adff289dd60f588a21501bd8dd21736e"} Dec 06 01:21:49 crc kubenswrapper[4991]: I1206 01:21:49.277445 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"d3a12716ed5cd142d546fee4e1567aa2455a249b72b69716ee9609e1c36e14f9"} Dec 06 01:21:49 crc kubenswrapper[4991]: I1206 01:21:49.277455 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"76553030b1211318d895a003e54cc793e3f250109afaa95a9897e5e235c25130"} Dec 06 01:21:52 crc kubenswrapper[4991]: I1206 01:21:52.893775 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-82ddm" podUID="a3995a5a-dfc3-4769-8161-31f7e2234d13" containerName="ovn-controller" 
probeResult="failure" output=< Dec 06 01:21:52 crc kubenswrapper[4991]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 01:21:52 crc kubenswrapper[4991]: > Dec 06 01:21:52 crc kubenswrapper[4991]: I1206 01:21:52.898959 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-529dn" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.118719 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-82ddm-config-7n6nx"] Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.121454 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.124640 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.169765 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6db\" (UniqueName: \"kubernetes.io/projected/c43175cc-c478-4abf-9e58-f6a31f656494-kube-api-access-dv6db\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.169823 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-scripts\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.169862 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.169892 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run-ovn\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.169915 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-log-ovn\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.170139 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-additional-scripts\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.170858 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82ddm-config-7n6nx"] Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271271 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-scripts\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " 
pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271311 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6db\" (UniqueName: \"kubernetes.io/projected/c43175cc-c478-4abf-9e58-f6a31f656494-kube-api-access-dv6db\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271338 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271361 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run-ovn\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271384 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-log-ovn\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271430 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-additional-scripts\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " 
pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271693 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.271987 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-additional-scripts\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.272022 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run-ovn\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.272071 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-log-ovn\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.273388 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-scripts\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc 
kubenswrapper[4991]: I1206 01:21:53.292222 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6db\" (UniqueName: \"kubernetes.io/projected/c43175cc-c478-4abf-9e58-f6a31f656494-kube-api-access-dv6db\") pod \"ovn-controller-82ddm-config-7n6nx\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:53 crc kubenswrapper[4991]: I1206 01:21:53.449759 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:21:57 crc kubenswrapper[4991]: I1206 01:21:57.853419 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-82ddm" podUID="a3995a5a-dfc3-4769-8161-31f7e2234d13" containerName="ovn-controller" probeResult="failure" output=< Dec 06 01:21:57 crc kubenswrapper[4991]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 06 01:21:57 crc kubenswrapper[4991]: > Dec 06 01:21:58 crc kubenswrapper[4991]: I1206 01:21:58.024176 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:21:58 crc kubenswrapper[4991]: I1206 01:21:58.350281 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 01:21:59 crc kubenswrapper[4991]: I1206 01:21:59.896683 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7g5ts"] Dec 06 01:21:59 crc kubenswrapper[4991]: I1206 01:21:59.898116 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7g5ts" Dec 06 01:21:59 crc kubenswrapper[4991]: I1206 01:21:59.913059 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7g5ts"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.012291 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-073a-account-create-update-5n44d"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.014417 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.019893 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.021497 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pz4lk"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.024728 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.040641 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-073a-account-create-update-5n44d"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.067325 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pz4lk"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.089439 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93aee8c2-c556-4635-ab46-f72bb431ed81-operator-scripts\") pod \"cinder-db-create-7g5ts\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.089672 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv9s6\" (UniqueName: \"kubernetes.io/projected/93aee8c2-c556-4635-ab46-f72bb431ed81-kube-api-access-wv9s6\") pod \"cinder-db-create-7g5ts\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.125733 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5745-account-create-update-54rdp"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.127613 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.130825 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.139931 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5745-account-create-update-54rdp"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.191821 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83441b36-34ed-4c10-848b-22ed0afae8cb-operator-scripts\") pod \"barbican-db-create-pz4lk\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.191879 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv9s6\" (UniqueName: \"kubernetes.io/projected/93aee8c2-c556-4635-ab46-f72bb431ed81-kube-api-access-wv9s6\") pod \"cinder-db-create-7g5ts\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.191947 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlc6v\" (UniqueName: \"kubernetes.io/projected/83441b36-34ed-4c10-848b-22ed0afae8cb-kube-api-access-hlc6v\") pod \"barbican-db-create-pz4lk\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.191970 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-operator-scripts\") pod \"barbican-073a-account-create-update-5n44d\" (UID: 
\"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.192044 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93aee8c2-c556-4635-ab46-f72bb431ed81-operator-scripts\") pod \"cinder-db-create-7g5ts\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.192512 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln578\" (UniqueName: \"kubernetes.io/projected/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-kube-api-access-ln578\") pod \"barbican-073a-account-create-update-5n44d\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.193426 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93aee8c2-c556-4635-ab46-f72bb431ed81-operator-scripts\") pod \"cinder-db-create-7g5ts\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.231441 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv9s6\" (UniqueName: \"kubernetes.io/projected/93aee8c2-c556-4635-ab46-f72bb431ed81-kube-api-access-wv9s6\") pod \"cinder-db-create-7g5ts\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.264678 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.294896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f680de3a-45a6-4f83-b27c-79ade3e8071a-operator-scripts\") pod \"cinder-5745-account-create-update-54rdp\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.294963 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83441b36-34ed-4c10-848b-22ed0afae8cb-operator-scripts\") pod \"barbican-db-create-pz4lk\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.295119 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlc6v\" (UniqueName: \"kubernetes.io/projected/83441b36-34ed-4c10-848b-22ed0afae8cb-kube-api-access-hlc6v\") pod \"barbican-db-create-pz4lk\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.295155 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-operator-scripts\") pod \"barbican-073a-account-create-update-5n44d\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.295184 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6th7\" (UniqueName: \"kubernetes.io/projected/f680de3a-45a6-4f83-b27c-79ade3e8071a-kube-api-access-j6th7\") pod 
\"cinder-5745-account-create-update-54rdp\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.295270 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln578\" (UniqueName: \"kubernetes.io/projected/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-kube-api-access-ln578\") pod \"barbican-073a-account-create-update-5n44d\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.295804 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83441b36-34ed-4c10-848b-22ed0afae8cb-operator-scripts\") pod \"barbican-db-create-pz4lk\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.295936 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-operator-scripts\") pod \"barbican-073a-account-create-update-5n44d\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.309022 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-q7rp9"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.310253 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.317554 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln578\" (UniqueName: \"kubernetes.io/projected/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-kube-api-access-ln578\") pod \"barbican-073a-account-create-update-5n44d\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.319462 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlc6v\" (UniqueName: \"kubernetes.io/projected/83441b36-34ed-4c10-848b-22ed0afae8cb-kube-api-access-hlc6v\") pod \"barbican-db-create-pz4lk\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.328152 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q7rp9"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.339300 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.360948 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.369122 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xqxr2"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.370633 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.375356 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.375453 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.375458 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.375641 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z245k" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.380348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xqxr2"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.397038 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-combined-ca-bundle\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.397689 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f680de3a-45a6-4f83-b27c-79ade3e8071a-operator-scripts\") pod \"cinder-5745-account-create-update-54rdp\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.397737 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbbg\" (UniqueName: \"kubernetes.io/projected/e64b0721-7f94-464a-a14f-3fc10d3b6c53-kube-api-access-qzbbg\") pod 
\"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.397822 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-config-data\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.397897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6th7\" (UniqueName: \"kubernetes.io/projected/f680de3a-45a6-4f83-b27c-79ade3e8071a-kube-api-access-j6th7\") pod \"cinder-5745-account-create-update-54rdp\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.398869 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f680de3a-45a6-4f83-b27c-79ade3e8071a-operator-scripts\") pod \"cinder-5745-account-create-update-54rdp\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.415497 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6th7\" (UniqueName: \"kubernetes.io/projected/f680de3a-45a6-4f83-b27c-79ade3e8071a-kube-api-access-j6th7\") pod \"cinder-5745-account-create-update-54rdp\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.450588 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.499633 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbbg\" (UniqueName: \"kubernetes.io/projected/e64b0721-7f94-464a-a14f-3fc10d3b6c53-kube-api-access-qzbbg\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.499739 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-config-data\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.499840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3672862f-2f4b-487a-b545-48dfd8515d8d-operator-scripts\") pod \"neutron-db-create-q7rp9\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.499876 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-combined-ca-bundle\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.499978 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfnm\" (UniqueName: \"kubernetes.io/projected/3672862f-2f4b-487a-b545-48dfd8515d8d-kube-api-access-lnfnm\") pod \"neutron-db-create-q7rp9\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " 
pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.502674 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-config-data\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.503536 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-combined-ca-bundle\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.515572 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbbg\" (UniqueName: \"kubernetes.io/projected/e64b0721-7f94-464a-a14f-3fc10d3b6c53-kube-api-access-qzbbg\") pod \"keystone-db-sync-xqxr2\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.522463 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1f6b-account-create-update-hcrwx"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.523725 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.525868 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1f6b-account-create-update-hcrwx"] Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.527801 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.602003 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z654\" (UniqueName: \"kubernetes.io/projected/59662742-d08d-498c-9e70-53288e527d9f-kube-api-access-2z654\") pod \"neutron-1f6b-account-create-update-hcrwx\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.602168 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3672862f-2f4b-487a-b545-48dfd8515d8d-operator-scripts\") pod \"neutron-db-create-q7rp9\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.602239 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfnm\" (UniqueName: \"kubernetes.io/projected/3672862f-2f4b-487a-b545-48dfd8515d8d-kube-api-access-lnfnm\") pod \"neutron-db-create-q7rp9\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.602274 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59662742-d08d-498c-9e70-53288e527d9f-operator-scripts\") pod \"neutron-1f6b-account-create-update-hcrwx\" (UID: 
\"59662742-d08d-498c-9e70-53288e527d9f\") " pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.602902 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3672862f-2f4b-487a-b545-48dfd8515d8d-operator-scripts\") pod \"neutron-db-create-q7rp9\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.620208 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfnm\" (UniqueName: \"kubernetes.io/projected/3672862f-2f4b-487a-b545-48dfd8515d8d-kube-api-access-lnfnm\") pod \"neutron-db-create-q7rp9\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.703112 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z654\" (UniqueName: \"kubernetes.io/projected/59662742-d08d-498c-9e70-53288e527d9f-kube-api-access-2z654\") pod \"neutron-1f6b-account-create-update-hcrwx\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.703232 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59662742-d08d-498c-9e70-53288e527d9f-operator-scripts\") pod \"neutron-1f6b-account-create-update-hcrwx\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.703838 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59662742-d08d-498c-9e70-53288e527d9f-operator-scripts\") pod 
\"neutron-1f6b-account-create-update-hcrwx\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.705579 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.715390 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.724693 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z654\" (UniqueName: \"kubernetes.io/projected/59662742-d08d-498c-9e70-53288e527d9f-kube-api-access-2z654\") pod \"neutron-1f6b-account-create-update-hcrwx\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:00 crc kubenswrapper[4991]: E1206 01:22:00.850894 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 06 01:22:00 crc kubenswrapper[4991]: E1206 01:22:00.851116 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7crl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-rssqk_openstack(1befea2d-2e39-4901-98ef-da500dcb82f9): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 06 01:22:00 crc kubenswrapper[4991]: E1206 01:22:00.852345 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-rssqk" podUID="1befea2d-2e39-4901-98ef-da500dcb82f9" Dec 06 01:22:00 crc kubenswrapper[4991]: I1206 01:22:00.864517 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:01 crc kubenswrapper[4991]: E1206 01:22:01.419492 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-rssqk" podUID="1befea2d-2e39-4901-98ef-da500dcb82f9" Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.588509 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82ddm-config-7n6nx"] Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.613794 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1f6b-account-create-update-hcrwx"] Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.636671 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pz4lk"] Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.666062 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7g5ts"] Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.682306 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xqxr2"] Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.742422 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5745-account-create-update-54rdp"] Dec 06 
01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.849307 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-073a-account-create-update-5n44d"] Dec 06 01:22:01 crc kubenswrapper[4991]: I1206 01:22:01.898467 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-q7rp9"] Dec 06 01:22:01 crc kubenswrapper[4991]: W1206 01:22:01.954761 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59662742_d08d_498c_9e70_53288e527d9f.slice/crio-6565787e49d053520d6dcc6941d526b303b689db8c957d472411d3cf4c434860 WatchSource:0}: Error finding container 6565787e49d053520d6dcc6941d526b303b689db8c957d472411d3cf4c434860: Status 404 returned error can't find the container with id 6565787e49d053520d6dcc6941d526b303b689db8c957d472411d3cf4c434860 Dec 06 01:22:01 crc kubenswrapper[4991]: W1206 01:22:01.965569 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba9c8d1e_e8d2_4bd2_8963_45ea5f7110fb.slice/crio-8220d222b95b23907a175ace62478e120bfe6468f465e0a0ed440c787cb1b106 WatchSource:0}: Error finding container 8220d222b95b23907a175ace62478e120bfe6468f465e0a0ed440c787cb1b106: Status 404 returned error can't find the container with id 8220d222b95b23907a175ace62478e120bfe6468f465e0a0ed440c787cb1b106 Dec 06 01:22:01 crc kubenswrapper[4991]: W1206 01:22:01.968529 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3672862f_2f4b_487a_b545_48dfd8515d8d.slice/crio-092b9f972b97bb6d25b2cae49e8bc84c6808b90b440af3252e3756e147a43fae WatchSource:0}: Error finding container 092b9f972b97bb6d25b2cae49e8bc84c6808b90b440af3252e3756e147a43fae: Status 404 returned error can't find the container with id 092b9f972b97bb6d25b2cae49e8bc84c6808b90b440af3252e3756e147a43fae Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 
01:22:02.432014 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-073a-account-create-update-5n44d" event={"ID":"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb","Type":"ContainerStarted","Data":"7333191c50f374062532cdc0893b90f11fb9de6310c3e103ddcfb276772886c0"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.432337 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-073a-account-create-update-5n44d" event={"ID":"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb","Type":"ContainerStarted","Data":"8220d222b95b23907a175ace62478e120bfe6468f465e0a0ed440c787cb1b106"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.435272 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5745-account-create-update-54rdp" event={"ID":"f680de3a-45a6-4f83-b27c-79ade3e8071a","Type":"ContainerStarted","Data":"9ba1f9363eab2d4cefcc3224ad8dc8008dde6ce02e8ae35584d4698c88943997"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.435316 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5745-account-create-update-54rdp" event={"ID":"f680de3a-45a6-4f83-b27c-79ade3e8071a","Type":"ContainerStarted","Data":"c0989412fab363e24f5b23efea21a94ade3cb6a9fcfce13fbc1a4d08bf35c94a"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.439012 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1f6b-account-create-update-hcrwx" event={"ID":"59662742-d08d-498c-9e70-53288e527d9f","Type":"ContainerStarted","Data":"9b837859471346fe2cdb38e81e756d345537fcd908d4593f2c37feb351f0a824"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.439053 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1f6b-account-create-update-hcrwx" event={"ID":"59662742-d08d-498c-9e70-53288e527d9f","Type":"ContainerStarted","Data":"6565787e49d053520d6dcc6941d526b303b689db8c957d472411d3cf4c434860"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.441063 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pz4lk" event={"ID":"83441b36-34ed-4c10-848b-22ed0afae8cb","Type":"ContainerStarted","Data":"38b0d127cd8af1a795f53a3bb1de89e71548588b0396410f5b4dc2f31f456b18"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.441090 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pz4lk" event={"ID":"83441b36-34ed-4c10-848b-22ed0afae8cb","Type":"ContainerStarted","Data":"cc13890caf6d75c5db84e7eb71078578083b883430618a395a5d3aa7a358fe1a"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.445689 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"7cbd5ff72c32ee5dc74b36134267ace1c62298a13ed1183c13504b36918331b0"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.447214 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q7rp9" event={"ID":"3672862f-2f4b-487a-b545-48dfd8515d8d","Type":"ContainerStarted","Data":"092b9f972b97bb6d25b2cae49e8bc84c6808b90b440af3252e3756e147a43fae"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.451654 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-073a-account-create-update-5n44d" podStartSLOduration=3.451637963 podStartE2EDuration="3.451637963s" podCreationTimestamp="2025-12-06 01:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:02.450701948 +0000 UTC m=+1195.800992436" watchObservedRunningTime="2025-12-06 01:22:02.451637963 +0000 UTC m=+1195.801928431" Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.464302 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm-config-7n6nx" 
event={"ID":"c43175cc-c478-4abf-9e58-f6a31f656494","Type":"ContainerStarted","Data":"8d56f2d1604806ccada0f86b92242b0d98db84ac69435d9ad92d653ec97d2e5a"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.478841 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1f6b-account-create-update-hcrwx" podStartSLOduration=2.478824175 podStartE2EDuration="2.478824175s" podCreationTimestamp="2025-12-06 01:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:02.471586885 +0000 UTC m=+1195.821877363" watchObservedRunningTime="2025-12-06 01:22:02.478824175 +0000 UTC m=+1195.829114633" Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.485865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7g5ts" event={"ID":"93aee8c2-c556-4635-ab46-f72bb431ed81","Type":"ContainerStarted","Data":"fbd2f9d4a3c7ec83307cd8d78841f72655859a751be415e536128375885f9d27"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.485912 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7g5ts" event={"ID":"93aee8c2-c556-4635-ab46-f72bb431ed81","Type":"ContainerStarted","Data":"29b40fde66bd01f8588f47ac1306f0c70cc38b6318dd408860c034c46a048b90"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.490401 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-5745-account-create-update-54rdp" podStartSLOduration=2.490389247 podStartE2EDuration="2.490389247s" podCreationTimestamp="2025-12-06 01:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:02.487963654 +0000 UTC m=+1195.838254122" watchObservedRunningTime="2025-12-06 01:22:02.490389247 +0000 UTC m=+1195.840679715" Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.492095 
4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqxr2" event={"ID":"e64b0721-7f94-464a-a14f-3fc10d3b6c53","Type":"ContainerStarted","Data":"93df943daf40e5acba1d56453160797ee20da73f8b08896aedd66d7ce55aeaa4"} Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.514344 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-pz4lk" podStartSLOduration=3.514311224 podStartE2EDuration="3.514311224s" podCreationTimestamp="2025-12-06 01:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:02.509133828 +0000 UTC m=+1195.859424296" watchObservedRunningTime="2025-12-06 01:22:02.514311224 +0000 UTC m=+1195.864601692" Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.535469 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-7g5ts" podStartSLOduration=3.535441307 podStartE2EDuration="3.535441307s" podCreationTimestamp="2025-12-06 01:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:02.527415497 +0000 UTC m=+1195.877705965" watchObservedRunningTime="2025-12-06 01:22:02.535441307 +0000 UTC m=+1195.885731775" Dec 06 01:22:02 crc kubenswrapper[4991]: I1206 01:22:02.866112 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-82ddm" Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.502742 4991 generic.go:334] "Generic (PLEG): container finished" podID="83441b36-34ed-4c10-848b-22ed0afae8cb" containerID="38b0d127cd8af1a795f53a3bb1de89e71548588b0396410f5b4dc2f31f456b18" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.502944 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pz4lk" 
event={"ID":"83441b36-34ed-4c10-848b-22ed0afae8cb","Type":"ContainerDied","Data":"38b0d127cd8af1a795f53a3bb1de89e71548588b0396410f5b4dc2f31f456b18"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.510896 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"22ebc6fb2c4adb4435ac4e187fe5258552dd440e00e529ed7fd0dc6cf594e65a"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.510932 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"291b5ec375baa80a3cddcdb6e675b69d9d99468d8721f07d5781f5b075de663e"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.510944 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"362f731443d3881a1f14e1b0a7bedbd7ee447eedccc3563e949aaaf7f8d5e1b2"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.510957 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"d0b49b913c8e36d42a75fdbf46aedd3a55c83aceb438051153cafa3f99641bbc"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.512698 4991 generic.go:334] "Generic (PLEG): container finished" podID="3672862f-2f4b-487a-b545-48dfd8515d8d" containerID="7b563f9d8d80963347c62d3191f2df9a2122f5c1037039569973064b1b1f687f" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.512751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q7rp9" event={"ID":"3672862f-2f4b-487a-b545-48dfd8515d8d","Type":"ContainerDied","Data":"7b563f9d8d80963347c62d3191f2df9a2122f5c1037039569973064b1b1f687f"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.534257 4991 
generic.go:334] "Generic (PLEG): container finished" podID="ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" containerID="7333191c50f374062532cdc0893b90f11fb9de6310c3e103ddcfb276772886c0" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.534385 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-073a-account-create-update-5n44d" event={"ID":"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb","Type":"ContainerDied","Data":"7333191c50f374062532cdc0893b90f11fb9de6310c3e103ddcfb276772886c0"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.544948 4991 generic.go:334] "Generic (PLEG): container finished" podID="c43175cc-c478-4abf-9e58-f6a31f656494" containerID="96b7d648cf0392096ea187447910834b858e1d9561006dcdf7a4c7bfed01f7d9" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.546274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm-config-7n6nx" event={"ID":"c43175cc-c478-4abf-9e58-f6a31f656494","Type":"ContainerDied","Data":"96b7d648cf0392096ea187447910834b858e1d9561006dcdf7a4c7bfed01f7d9"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.552331 4991 generic.go:334] "Generic (PLEG): container finished" podID="f680de3a-45a6-4f83-b27c-79ade3e8071a" containerID="9ba1f9363eab2d4cefcc3224ad8dc8008dde6ce02e8ae35584d4698c88943997" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.552414 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5745-account-create-update-54rdp" event={"ID":"f680de3a-45a6-4f83-b27c-79ade3e8071a","Type":"ContainerDied","Data":"9ba1f9363eab2d4cefcc3224ad8dc8008dde6ce02e8ae35584d4698c88943997"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.555489 4991 generic.go:334] "Generic (PLEG): container finished" podID="93aee8c2-c556-4635-ab46-f72bb431ed81" containerID="fbd2f9d4a3c7ec83307cd8d78841f72655859a751be415e536128375885f9d27" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.555518 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7g5ts" event={"ID":"93aee8c2-c556-4635-ab46-f72bb431ed81","Type":"ContainerDied","Data":"fbd2f9d4a3c7ec83307cd8d78841f72655859a751be415e536128375885f9d27"} Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.562167 4991 generic.go:334] "Generic (PLEG): container finished" podID="59662742-d08d-498c-9e70-53288e527d9f" containerID="9b837859471346fe2cdb38e81e756d345537fcd908d4593f2c37feb351f0a824" exitCode=0 Dec 06 01:22:03 crc kubenswrapper[4991]: I1206 01:22:03.562213 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1f6b-account-create-update-hcrwx" event={"ID":"59662742-d08d-498c-9e70-53288e527d9f","Type":"ContainerDied","Data":"9b837859471346fe2cdb38e81e756d345537fcd908d4593f2c37feb351f0a824"} Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.583016 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"067cca556690b6da46f9217c690cd53267d24f2af5d513b9667741453042633a"} Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.583081 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"04a746be-9ac3-4e2e-a264-6fc99dc51527","Type":"ContainerStarted","Data":"eadadbebf20093551ed792d43e45670432e03f81928da7086de4598707ec398a"} Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.646371 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.499395334 podStartE2EDuration="40.646351315s" podCreationTimestamp="2025-12-06 01:21:24 +0000 UTC" firstStartedPulling="2025-12-06 01:21:41.938614787 +0000 UTC m=+1175.288905255" lastFinishedPulling="2025-12-06 01:22:02.085570748 +0000 UTC m=+1195.435861236" observedRunningTime="2025-12-06 01:22:04.642109604 +0000 UTC m=+1197.992400112" watchObservedRunningTime="2025-12-06 
01:22:04.646351315 +0000 UTC m=+1197.996641783" Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.932138 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-lnjwh"] Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.933542 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.935313 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 06 01:22:04 crc kubenswrapper[4991]: I1206 01:22:04.959053 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-lnjwh"] Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.010701 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-config\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.010749 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.010770 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7fsq\" (UniqueName: \"kubernetes.io/projected/d73d18a0-25b2-4b77-a62d-b6766b700f3b-kube-api-access-b7fsq\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 
01:22:05.010793 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.010829 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.010847 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.112127 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.112385 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc 
kubenswrapper[4991]: I1206 01:22:05.112471 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-config\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.112496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.112517 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7fsq\" (UniqueName: \"kubernetes.io/projected/d73d18a0-25b2-4b77-a62d-b6766b700f3b-kube-api-access-b7fsq\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.112546 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.114473 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.115041 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.127310 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.127415 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-config\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.130156 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.131370 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7fsq\" (UniqueName: \"kubernetes.io/projected/d73d18a0-25b2-4b77-a62d-b6766b700f3b-kube-api-access-b7fsq\") pod \"dnsmasq-dns-6d5b6d6b67-lnjwh\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:05 crc kubenswrapper[4991]: I1206 01:22:05.272522 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.904710 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.911049 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.930732 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.936763 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.954001 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.976447 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:07 crc kubenswrapper[4991]: I1206 01:22:07.976896 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.067576 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z654\" (UniqueName: \"kubernetes.io/projected/59662742-d08d-498c-9e70-53288e527d9f-kube-api-access-2z654\") pod \"59662742-d08d-498c-9e70-53288e527d9f\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.067664 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93aee8c2-c556-4635-ab46-f72bb431ed81-operator-scripts\") pod \"93aee8c2-c556-4635-ab46-f72bb431ed81\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.067698 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83441b36-34ed-4c10-848b-22ed0afae8cb-operator-scripts\") pod \"83441b36-34ed-4c10-848b-22ed0afae8cb\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.067737 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv9s6\" (UniqueName: \"kubernetes.io/projected/93aee8c2-c556-4635-ab46-f72bb431ed81-kube-api-access-wv9s6\") pod \"93aee8c2-c556-4635-ab46-f72bb431ed81\" (UID: \"93aee8c2-c556-4635-ab46-f72bb431ed81\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.067772 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlc6v\" (UniqueName: \"kubernetes.io/projected/83441b36-34ed-4c10-848b-22ed0afae8cb-kube-api-access-hlc6v\") pod \"83441b36-34ed-4c10-848b-22ed0afae8cb\" (UID: \"83441b36-34ed-4c10-848b-22ed0afae8cb\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.068892 4991 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93aee8c2-c556-4635-ab46-f72bb431ed81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93aee8c2-c556-4635-ab46-f72bb431ed81" (UID: "93aee8c2-c556-4635-ab46-f72bb431ed81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069077 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-log-ovn\") pod \"c43175cc-c478-4abf-9e58-f6a31f656494\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069109 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-operator-scripts\") pod \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069130 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3672862f-2f4b-487a-b545-48dfd8515d8d-operator-scripts\") pod \"3672862f-2f4b-487a-b545-48dfd8515d8d\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069156 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnfnm\" (UniqueName: \"kubernetes.io/projected/3672862f-2f4b-487a-b545-48dfd8515d8d-kube-api-access-lnfnm\") pod \"3672862f-2f4b-487a-b545-48dfd8515d8d\" (UID: \"3672862f-2f4b-487a-b545-48dfd8515d8d\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069197 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run\") pod \"c43175cc-c478-4abf-9e58-f6a31f656494\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069218 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln578\" (UniqueName: \"kubernetes.io/projected/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-kube-api-access-ln578\") pod \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\" (UID: \"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069233 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59662742-d08d-498c-9e70-53288e527d9f-operator-scripts\") pod \"59662742-d08d-498c-9e70-53288e527d9f\" (UID: \"59662742-d08d-498c-9e70-53288e527d9f\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069257 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv6db\" (UniqueName: \"kubernetes.io/projected/c43175cc-c478-4abf-9e58-f6a31f656494-kube-api-access-dv6db\") pod \"c43175cc-c478-4abf-9e58-f6a31f656494\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069273 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run-ovn\") pod \"c43175cc-c478-4abf-9e58-f6a31f656494\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069293 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f680de3a-45a6-4f83-b27c-79ade3e8071a-operator-scripts\") pod \"f680de3a-45a6-4f83-b27c-79ade3e8071a\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " Dec 06 01:22:08 crc 
kubenswrapper[4991]: I1206 01:22:08.069317 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-additional-scripts\") pod \"c43175cc-c478-4abf-9e58-f6a31f656494\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069336 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6th7\" (UniqueName: \"kubernetes.io/projected/f680de3a-45a6-4f83-b27c-79ade3e8071a-kube-api-access-j6th7\") pod \"f680de3a-45a6-4f83-b27c-79ade3e8071a\" (UID: \"f680de3a-45a6-4f83-b27c-79ade3e8071a\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.069613 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93aee8c2-c556-4635-ab46-f72bb431ed81-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.070497 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59662742-d08d-498c-9e70-53288e527d9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59662742-d08d-498c-9e70-53288e527d9f" (UID: "59662742-d08d-498c-9e70-53288e527d9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.070529 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c43175cc-c478-4abf-9e58-f6a31f656494" (UID: "c43175cc-c478-4abf-9e58-f6a31f656494"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.070834 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" (UID: "ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.071126 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3672862f-2f4b-487a-b545-48dfd8515d8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3672862f-2f4b-487a-b545-48dfd8515d8d" (UID: "3672862f-2f4b-487a-b545-48dfd8515d8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.072628 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c43175cc-c478-4abf-9e58-f6a31f656494" (UID: "c43175cc-c478-4abf-9e58-f6a31f656494"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.072669 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run" (OuterVolumeSpecName: "var-run") pod "c43175cc-c478-4abf-9e58-f6a31f656494" (UID: "c43175cc-c478-4abf-9e58-f6a31f656494"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.073530 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f680de3a-45a6-4f83-b27c-79ade3e8071a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f680de3a-45a6-4f83-b27c-79ade3e8071a" (UID: "f680de3a-45a6-4f83-b27c-79ade3e8071a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.074025 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c43175cc-c478-4abf-9e58-f6a31f656494" (UID: "c43175cc-c478-4abf-9e58-f6a31f656494"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.074253 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83441b36-34ed-4c10-848b-22ed0afae8cb-kube-api-access-hlc6v" (OuterVolumeSpecName: "kube-api-access-hlc6v") pod "83441b36-34ed-4c10-848b-22ed0afae8cb" (UID: "83441b36-34ed-4c10-848b-22ed0afae8cb"). InnerVolumeSpecName "kube-api-access-hlc6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.074334 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93aee8c2-c556-4635-ab46-f72bb431ed81-kube-api-access-wv9s6" (OuterVolumeSpecName: "kube-api-access-wv9s6") pod "93aee8c2-c556-4635-ab46-f72bb431ed81" (UID: "93aee8c2-c556-4635-ab46-f72bb431ed81"). InnerVolumeSpecName "kube-api-access-wv9s6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.074364 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59662742-d08d-498c-9e70-53288e527d9f-kube-api-access-2z654" (OuterVolumeSpecName: "kube-api-access-2z654") pod "59662742-d08d-498c-9e70-53288e527d9f" (UID: "59662742-d08d-498c-9e70-53288e527d9f"). InnerVolumeSpecName "kube-api-access-2z654". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.075106 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3672862f-2f4b-487a-b545-48dfd8515d8d-kube-api-access-lnfnm" (OuterVolumeSpecName: "kube-api-access-lnfnm") pod "3672862f-2f4b-487a-b545-48dfd8515d8d" (UID: "3672862f-2f4b-487a-b545-48dfd8515d8d"). InnerVolumeSpecName "kube-api-access-lnfnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.075632 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-kube-api-access-ln578" (OuterVolumeSpecName: "kube-api-access-ln578") pod "ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" (UID: "ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb"). InnerVolumeSpecName "kube-api-access-ln578". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.076355 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f680de3a-45a6-4f83-b27c-79ade3e8071a-kube-api-access-j6th7" (OuterVolumeSpecName: "kube-api-access-j6th7") pod "f680de3a-45a6-4f83-b27c-79ade3e8071a" (UID: "f680de3a-45a6-4f83-b27c-79ade3e8071a"). InnerVolumeSpecName "kube-api-access-j6th7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.077223 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83441b36-34ed-4c10-848b-22ed0afae8cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83441b36-34ed-4c10-848b-22ed0afae8cb" (UID: "83441b36-34ed-4c10-848b-22ed0afae8cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.084257 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c43175cc-c478-4abf-9e58-f6a31f656494-kube-api-access-dv6db" (OuterVolumeSpecName: "kube-api-access-dv6db") pod "c43175cc-c478-4abf-9e58-f6a31f656494" (UID: "c43175cc-c478-4abf-9e58-f6a31f656494"). InnerVolumeSpecName "kube-api-access-dv6db". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.170594 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-scripts\") pod \"c43175cc-c478-4abf-9e58-f6a31f656494\" (UID: \"c43175cc-c478-4abf-9e58-f6a31f656494\") " Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171159 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnfnm\" (UniqueName: \"kubernetes.io/projected/3672862f-2f4b-487a-b545-48dfd8515d8d-kube-api-access-lnfnm\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171190 4991 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171205 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/59662742-d08d-498c-9e70-53288e527d9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171217 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln578\" (UniqueName: \"kubernetes.io/projected/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-kube-api-access-ln578\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171230 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv6db\" (UniqueName: \"kubernetes.io/projected/c43175cc-c478-4abf-9e58-f6a31f656494-kube-api-access-dv6db\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171243 4991 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171255 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f680de3a-45a6-4f83-b27c-79ade3e8071a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171266 4991 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171278 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6th7\" (UniqueName: \"kubernetes.io/projected/f680de3a-45a6-4f83-b27c-79ade3e8071a-kube-api-access-j6th7\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171292 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z654\" (UniqueName: 
\"kubernetes.io/projected/59662742-d08d-498c-9e70-53288e527d9f-kube-api-access-2z654\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171303 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83441b36-34ed-4c10-848b-22ed0afae8cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171315 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv9s6\" (UniqueName: \"kubernetes.io/projected/93aee8c2-c556-4635-ab46-f72bb431ed81-kube-api-access-wv9s6\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171326 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlc6v\" (UniqueName: \"kubernetes.io/projected/83441b36-34ed-4c10-848b-22ed0afae8cb-kube-api-access-hlc6v\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171339 4991 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c43175cc-c478-4abf-9e58-f6a31f656494-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171350 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171362 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3672862f-2f4b-487a-b545-48dfd8515d8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.171665 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-scripts" (OuterVolumeSpecName: 
"scripts") pod "c43175cc-c478-4abf-9e58-f6a31f656494" (UID: "c43175cc-c478-4abf-9e58-f6a31f656494"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.276356 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c43175cc-c478-4abf-9e58-f6a31f656494-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.277956 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-lnjwh"] Dec 06 01:22:08 crc kubenswrapper[4991]: W1206 01:22:08.284910 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd73d18a0_25b2_4b77_a62d_b6766b700f3b.slice/crio-cc8aaae54f96ff7fb89e833fb967b2f7ac0da27a17eb86686caad2f92eaf85c9 WatchSource:0}: Error finding container cc8aaae54f96ff7fb89e833fb967b2f7ac0da27a17eb86686caad2f92eaf85c9: Status 404 returned error can't find the container with id cc8aaae54f96ff7fb89e833fb967b2f7ac0da27a17eb86686caad2f92eaf85c9 Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.635166 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5745-account-create-update-54rdp" event={"ID":"f680de3a-45a6-4f83-b27c-79ade3e8071a","Type":"ContainerDied","Data":"c0989412fab363e24f5b23efea21a94ade3cb6a9fcfce13fbc1a4d08bf35c94a"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.635529 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0989412fab363e24f5b23efea21a94ade3cb6a9fcfce13fbc1a4d08bf35c94a" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.635255 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5745-account-create-update-54rdp" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.637710 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7g5ts" event={"ID":"93aee8c2-c556-4635-ab46-f72bb431ed81","Type":"ContainerDied","Data":"29b40fde66bd01f8588f47ac1306f0c70cc38b6318dd408860c034c46a048b90"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.637777 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29b40fde66bd01f8588f47ac1306f0c70cc38b6318dd408860c034c46a048b90" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.637789 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7g5ts" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.640188 4991 generic.go:334] "Generic (PLEG): container finished" podID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerID="eeb10dcd78cd2d90546d566f0e0b821475691080a4e0457fe3cd11aec25b4dee" exitCode=0 Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.640314 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" event={"ID":"d73d18a0-25b2-4b77-a62d-b6766b700f3b","Type":"ContainerDied","Data":"eeb10dcd78cd2d90546d566f0e0b821475691080a4e0457fe3cd11aec25b4dee"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.640433 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" event={"ID":"d73d18a0-25b2-4b77-a62d-b6766b700f3b","Type":"ContainerStarted","Data":"cc8aaae54f96ff7fb89e833fb967b2f7ac0da27a17eb86686caad2f92eaf85c9"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.644898 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-q7rp9" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.644907 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-q7rp9" event={"ID":"3672862f-2f4b-487a-b545-48dfd8515d8d","Type":"ContainerDied","Data":"092b9f972b97bb6d25b2cae49e8bc84c6808b90b440af3252e3756e147a43fae"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.645056 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="092b9f972b97bb6d25b2cae49e8bc84c6808b90b440af3252e3756e147a43fae" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.648171 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqxr2" event={"ID":"e64b0721-7f94-464a-a14f-3fc10d3b6c53","Type":"ContainerStarted","Data":"987f550151737b52f41ff96232ad7783addb53fd67e320f7a9588d5a983ea64b"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.650341 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1f6b-account-create-update-hcrwx" event={"ID":"59662742-d08d-498c-9e70-53288e527d9f","Type":"ContainerDied","Data":"6565787e49d053520d6dcc6941d526b303b689db8c957d472411d3cf4c434860"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.650379 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6565787e49d053520d6dcc6941d526b303b689db8c957d472411d3cf4c434860" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.650478 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1f6b-account-create-update-hcrwx" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.653101 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm-config-7n6nx" event={"ID":"c43175cc-c478-4abf-9e58-f6a31f656494","Type":"ContainerDied","Data":"8d56f2d1604806ccada0f86b92242b0d98db84ac69435d9ad92d653ec97d2e5a"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.653188 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d56f2d1604806ccada0f86b92242b0d98db84ac69435d9ad92d653ec97d2e5a" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.653356 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82ddm-config-7n6nx" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.663226 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pz4lk" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.663245 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pz4lk" event={"ID":"83441b36-34ed-4c10-848b-22ed0afae8cb","Type":"ContainerDied","Data":"cc13890caf6d75c5db84e7eb71078578083b883430618a395a5d3aa7a358fe1a"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.663493 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc13890caf6d75c5db84e7eb71078578083b883430618a395a5d3aa7a358fe1a" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.665657 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-073a-account-create-update-5n44d" event={"ID":"ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb","Type":"ContainerDied","Data":"8220d222b95b23907a175ace62478e120bfe6468f465e0a0ed440c787cb1b106"} Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.665795 4991 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8220d222b95b23907a175ace62478e120bfe6468f465e0a0ed440c787cb1b106" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.665803 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-073a-account-create-update-5n44d" Dec 06 01:22:08 crc kubenswrapper[4991]: I1206 01:22:08.731631 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xqxr2" podStartSLOduration=2.863240448 podStartE2EDuration="8.731602083s" podCreationTimestamp="2025-12-06 01:22:00 +0000 UTC" firstStartedPulling="2025-12-06 01:22:01.951105798 +0000 UTC m=+1195.301396266" lastFinishedPulling="2025-12-06 01:22:07.819467443 +0000 UTC m=+1201.169757901" observedRunningTime="2025-12-06 01:22:08.710148872 +0000 UTC m=+1202.060439380" watchObservedRunningTime="2025-12-06 01:22:08.731602083 +0000 UTC m=+1202.081892591" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.160101 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-82ddm-config-7n6nx"] Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.166390 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-82ddm-config-7n6nx"] Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.258603 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-82ddm-config-drvft"] Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260282 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83441b36-34ed-4c10-848b-22ed0afae8cb" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260307 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="83441b36-34ed-4c10-848b-22ed0afae8cb" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260338 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" 
containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260347 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260372 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59662742-d08d-498c-9e70-53288e527d9f" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260380 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="59662742-d08d-498c-9e70-53288e527d9f" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260393 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43175cc-c478-4abf-9e58-f6a31f656494" containerName="ovn-config" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260404 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43175cc-c478-4abf-9e58-f6a31f656494" containerName="ovn-config" Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260432 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f680de3a-45a6-4f83-b27c-79ade3e8071a" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260441 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f680de3a-45a6-4f83-b27c-79ade3e8071a" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260457 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93aee8c2-c556-4635-ab46-f72bb431ed81" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260465 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="93aee8c2-c556-4635-ab46-f72bb431ed81" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: E1206 01:22:09.260482 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3672862f-2f4b-487a-b545-48dfd8515d8d" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260492 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3672862f-2f4b-487a-b545-48dfd8515d8d" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260703 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f680de3a-45a6-4f83-b27c-79ade3e8071a" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260721 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="83441b36-34ed-4c10-848b-22ed0afae8cb" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260733 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3672862f-2f4b-487a-b545-48dfd8515d8d" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260753 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260765 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="59662742-d08d-498c-9e70-53288e527d9f" containerName="mariadb-account-create-update" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260789 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="93aee8c2-c556-4635-ab46-f72bb431ed81" containerName="mariadb-database-create" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.260801 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c43175cc-c478-4abf-9e58-f6a31f656494" containerName="ovn-config" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.262695 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.264753 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.268533 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82ddm-config-drvft"] Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.396877 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-additional-scripts\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.397088 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run-ovn\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.397128 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-log-ovn\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.397167 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run\") pod \"ovn-controller-82ddm-config-drvft\" (UID: 
\"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.397201 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzrpb\" (UniqueName: \"kubernetes.io/projected/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-kube-api-access-kzrpb\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.397235 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-scripts\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.498877 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-additional-scripts\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499012 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run-ovn\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499040 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-log-ovn\") pod 
\"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499074 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499105 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzrpb\" (UniqueName: \"kubernetes.io/projected/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-kube-api-access-kzrpb\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499130 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-scripts\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run-ovn\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499469 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run\") pod \"ovn-controller-82ddm-config-drvft\" (UID: 
\"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499488 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-log-ovn\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.499895 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-additional-scripts\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.501128 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-scripts\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.518759 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzrpb\" (UniqueName: \"kubernetes.io/projected/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-kube-api-access-kzrpb\") pod \"ovn-controller-82ddm-config-drvft\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.597231 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.683765 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" event={"ID":"d73d18a0-25b2-4b77-a62d-b6766b700f3b","Type":"ContainerStarted","Data":"e028cfba7f3e414ae2c808b5c301de7c336744f3f8cfb27f6230c285427ebd4c"} Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.686171 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.723973 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" podStartSLOduration=5.723951585 podStartE2EDuration="5.723951585s" podCreationTimestamp="2025-12-06 01:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:09.720558736 +0000 UTC m=+1203.070849224" watchObservedRunningTime="2025-12-06 01:22:09.723951585 +0000 UTC m=+1203.074242053" Dec 06 01:22:09 crc kubenswrapper[4991]: I1206 01:22:09.885317 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82ddm-config-drvft"] Dec 06 01:22:09 crc kubenswrapper[4991]: W1206 01:22:09.894761 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod418fb0c8_5dd0_4425_9dc5_7cfe8810958e.slice/crio-bad2b72263bf52e1c564eed43950c86df2e01192dd61cb87d32ccfdef054d223 WatchSource:0}: Error finding container bad2b72263bf52e1c564eed43950c86df2e01192dd61cb87d32ccfdef054d223: Status 404 returned error can't find the container with id bad2b72263bf52e1c564eed43950c86df2e01192dd61cb87d32ccfdef054d223 Dec 06 01:22:10 crc kubenswrapper[4991]: I1206 01:22:10.694726 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-82ddm-config-drvft" event={"ID":"418fb0c8-5dd0-4425-9dc5-7cfe8810958e","Type":"ContainerStarted","Data":"86cf516df72440324964d815e6964ee90ca245175fb9dd35a5350df7d283107f"} Dec 06 01:22:10 crc kubenswrapper[4991]: I1206 01:22:10.695148 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm-config-drvft" event={"ID":"418fb0c8-5dd0-4425-9dc5-7cfe8810958e","Type":"ContainerStarted","Data":"bad2b72263bf52e1c564eed43950c86df2e01192dd61cb87d32ccfdef054d223"} Dec 06 01:22:10 crc kubenswrapper[4991]: I1206 01:22:10.717972 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-82ddm-config-drvft" podStartSLOduration=1.71795247 podStartE2EDuration="1.71795247s" podCreationTimestamp="2025-12-06 01:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:10.710196007 +0000 UTC m=+1204.060486485" watchObservedRunningTime="2025-12-06 01:22:10.71795247 +0000 UTC m=+1204.068242938" Dec 06 01:22:11 crc kubenswrapper[4991]: I1206 01:22:11.047525 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c43175cc-c478-4abf-9e58-f6a31f656494" path="/var/lib/kubelet/pods/c43175cc-c478-4abf-9e58-f6a31f656494/volumes" Dec 06 01:22:11 crc kubenswrapper[4991]: I1206 01:22:11.709047 4991 generic.go:334] "Generic (PLEG): container finished" podID="418fb0c8-5dd0-4425-9dc5-7cfe8810958e" containerID="86cf516df72440324964d815e6964ee90ca245175fb9dd35a5350df7d283107f" exitCode=0 Dec 06 01:22:11 crc kubenswrapper[4991]: I1206 01:22:11.709211 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm-config-drvft" event={"ID":"418fb0c8-5dd0-4425-9dc5-7cfe8810958e","Type":"ContainerDied","Data":"86cf516df72440324964d815e6964ee90ca245175fb9dd35a5350df7d283107f"} Dec 06 01:22:12 crc kubenswrapper[4991]: I1206 01:22:12.721502 4991 
generic.go:334] "Generic (PLEG): container finished" podID="e64b0721-7f94-464a-a14f-3fc10d3b6c53" containerID="987f550151737b52f41ff96232ad7783addb53fd67e320f7a9588d5a983ea64b" exitCode=0 Dec 06 01:22:12 crc kubenswrapper[4991]: I1206 01:22:12.721610 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqxr2" event={"ID":"e64b0721-7f94-464a-a14f-3fc10d3b6c53","Type":"ContainerDied","Data":"987f550151737b52f41ff96232ad7783addb53fd67e320f7a9588d5a983ea64b"} Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.069548 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.172230 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-scripts\") pod \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.172347 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-additional-scripts\") pod \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.172418 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzrpb\" (UniqueName: \"kubernetes.io/projected/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-kube-api-access-kzrpb\") pod \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.172474 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run\") pod \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.172514 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run-ovn\") pod \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.172539 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-log-ovn\") pod \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\" (UID: \"418fb0c8-5dd0-4425-9dc5-7cfe8810958e\") " Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.173134 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "418fb0c8-5dd0-4425-9dc5-7cfe8810958e" (UID: "418fb0c8-5dd0-4425-9dc5-7cfe8810958e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.173194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "418fb0c8-5dd0-4425-9dc5-7cfe8810958e" (UID: "418fb0c8-5dd0-4425-9dc5-7cfe8810958e"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.173210 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run" (OuterVolumeSpecName: "var-run") pod "418fb0c8-5dd0-4425-9dc5-7cfe8810958e" (UID: "418fb0c8-5dd0-4425-9dc5-7cfe8810958e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.173820 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-scripts" (OuterVolumeSpecName: "scripts") pod "418fb0c8-5dd0-4425-9dc5-7cfe8810958e" (UID: "418fb0c8-5dd0-4425-9dc5-7cfe8810958e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.174542 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "418fb0c8-5dd0-4425-9dc5-7cfe8810958e" (UID: "418fb0c8-5dd0-4425-9dc5-7cfe8810958e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.192244 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-kube-api-access-kzrpb" (OuterVolumeSpecName: "kube-api-access-kzrpb") pod "418fb0c8-5dd0-4425-9dc5-7cfe8810958e" (UID: "418fb0c8-5dd0-4425-9dc5-7cfe8810958e"). InnerVolumeSpecName "kube-api-access-kzrpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.275436 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.275489 4991 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.275508 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzrpb\" (UniqueName: \"kubernetes.io/projected/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-kube-api-access-kzrpb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.275526 4991 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.275543 4991 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.275559 4991 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/418fb0c8-5dd0-4425-9dc5-7cfe8810958e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.732026 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82ddm-config-drvft" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.732050 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82ddm-config-drvft" event={"ID":"418fb0c8-5dd0-4425-9dc5-7cfe8810958e","Type":"ContainerDied","Data":"bad2b72263bf52e1c564eed43950c86df2e01192dd61cb87d32ccfdef054d223"} Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.732137 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad2b72263bf52e1c564eed43950c86df2e01192dd61cb87d32ccfdef054d223" Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.816113 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-82ddm-config-drvft"] Dec 06 01:22:13 crc kubenswrapper[4991]: I1206 01:22:13.851279 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-82ddm-config-drvft"] Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.047659 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.193411 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-combined-ca-bundle\") pod \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.193491 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzbbg\" (UniqueName: \"kubernetes.io/projected/e64b0721-7f94-464a-a14f-3fc10d3b6c53-kube-api-access-qzbbg\") pod \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.193575 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-config-data\") pod \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\" (UID: \"e64b0721-7f94-464a-a14f-3fc10d3b6c53\") " Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.199459 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64b0721-7f94-464a-a14f-3fc10d3b6c53-kube-api-access-qzbbg" (OuterVolumeSpecName: "kube-api-access-qzbbg") pod "e64b0721-7f94-464a-a14f-3fc10d3b6c53" (UID: "e64b0721-7f94-464a-a14f-3fc10d3b6c53"). InnerVolumeSpecName "kube-api-access-qzbbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.217148 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e64b0721-7f94-464a-a14f-3fc10d3b6c53" (UID: "e64b0721-7f94-464a-a14f-3fc10d3b6c53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.236568 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-config-data" (OuterVolumeSpecName: "config-data") pod "e64b0721-7f94-464a-a14f-3fc10d3b6c53" (UID: "e64b0721-7f94-464a-a14f-3fc10d3b6c53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.295810 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.295864 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64b0721-7f94-464a-a14f-3fc10d3b6c53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.295887 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzbbg\" (UniqueName: \"kubernetes.io/projected/e64b0721-7f94-464a-a14f-3fc10d3b6c53-kube-api-access-qzbbg\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.745821 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqxr2" event={"ID":"e64b0721-7f94-464a-a14f-3fc10d3b6c53","Type":"ContainerDied","Data":"93df943daf40e5acba1d56453160797ee20da73f8b08896aedd66d7ce55aeaa4"} Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.745860 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93df943daf40e5acba1d56453160797ee20da73f8b08896aedd66d7ce55aeaa4" Dec 06 01:22:14 crc kubenswrapper[4991]: I1206 01:22:14.745911 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xqxr2" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.056119 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418fb0c8-5dd0-4425-9dc5-7cfe8810958e" path="/var/lib/kubelet/pods/418fb0c8-5dd0-4425-9dc5-7cfe8810958e/volumes" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.057490 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-lnjwh"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.057719 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="dnsmasq-dns" containerID="cri-o://e028cfba7f3e414ae2c808b5c301de7c336744f3f8cfb27f6230c285427ebd4c" gracePeriod=10 Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.060351 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.072512 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zdfxt"] Dec 06 01:22:15 crc kubenswrapper[4991]: E1206 01:22:15.072893 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418fb0c8-5dd0-4425-9dc5-7cfe8810958e" containerName="ovn-config" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.072911 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="418fb0c8-5dd0-4425-9dc5-7cfe8810958e" containerName="ovn-config" Dec 06 01:22:15 crc kubenswrapper[4991]: E1206 01:22:15.072943 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64b0721-7f94-464a-a14f-3fc10d3b6c53" containerName="keystone-db-sync" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.072950 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64b0721-7f94-464a-a14f-3fc10d3b6c53" containerName="keystone-db-sync" Dec 06 01:22:15 crc 
kubenswrapper[4991]: I1206 01:22:15.073118 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64b0721-7f94-464a-a14f-3fc10d3b6c53" containerName="keystone-db-sync" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.073136 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="418fb0c8-5dd0-4425-9dc5-7cfe8810958e" containerName="ovn-config" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.073724 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.085537 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.085588 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.085752 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.085790 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.085952 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z245k" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.091613 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zdfxt"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.104971 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-7gnlm"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.106322 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.162201 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-7gnlm"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217597 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-combined-ca-bundle\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217657 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-config-data\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217719 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217745 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217768 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-credential-keys\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217791 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.217814 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp8tc\" (UniqueName: \"kubernetes.io/projected/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-kube-api-access-tp8tc\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.218011 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-scripts\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.218075 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbkt\" (UniqueName: \"kubernetes.io/projected/4b098106-2ef2-4f7a-adad-6b9b3349d846-kube-api-access-2jbkt\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.218108 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.218153 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-config\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.218207 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-fernet-keys\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.234691 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59788c8c95-f8d4v"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.235941 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.238100 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.238218 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-hcpsd" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.238407 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.238647 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.266347 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59788c8c95-f8d4v"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.273111 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.319463 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-combined-ca-bundle\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.319740 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-config-data\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc 
kubenswrapper[4991]: I1206 01:22:15.319862 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.319942 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320036 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-credential-keys\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320128 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320203 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp8tc\" (UniqueName: \"kubernetes.io/projected/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-kube-api-access-tp8tc\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 
01:22:15.320286 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-scripts\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320358 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbkt\" (UniqueName: \"kubernetes.io/projected/4b098106-2ef2-4f7a-adad-6b9b3349d846-kube-api-access-2jbkt\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320440 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320510 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-config\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.320587 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-fernet-keys\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.330463 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-svc\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.330701 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-config\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.331551 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.332035 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.334358 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-znbg5"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.337904 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-config-data\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.339393 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-combined-ca-bundle\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.343591 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-credential-keys\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.344168 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-fernet-keys\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.344457 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.345533 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-scripts\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.349587 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.356850 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.357768 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.358031 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6wbm9" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.360177 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbkt\" (UniqueName: \"kubernetes.io/projected/4b098106-2ef2-4f7a-adad-6b9b3349d846-kube-api-access-2jbkt\") pod \"keystone-bootstrap-zdfxt\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.373773 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp8tc\" (UniqueName: \"kubernetes.io/projected/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-kube-api-access-tp8tc\") pod \"dnsmasq-dns-6f8c45789f-7gnlm\" (UID: 
\"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.380957 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-5hg4t"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.382210 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.388401 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-znbg5"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.388822 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-kdjvx" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.389008 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.389593 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.403718 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5hg4t"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.413079 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.434889 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.435878 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41dee50a-eb94-4447-b27a-9480e91a10e1-logs\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.435996 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-scripts\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.436038 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgf9h\" (UniqueName: \"kubernetes.io/projected/41dee50a-eb94-4447-b27a-9480e91a10e1-kube-api-access-rgf9h\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.436058 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41dee50a-eb94-4447-b27a-9480e91a10e1-horizon-secret-key\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.436097 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-config-data\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " 
pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.440285 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.442934 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.443172 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539264 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539658 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6r5j\" (UniqueName: \"kubernetes.io/projected/48107f10-c266-4cf9-b65f-bca87b8cdc86-kube-api-access-z6r5j\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539708 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539748 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41dee50a-eb94-4447-b27a-9480e91a10e1-logs\") pod \"horizon-59788c8c95-f8d4v\" (UID: 
\"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539776 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41629f2e-563a-4e76-8ac2-530e8d204578-etc-machine-id\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539802 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-db-sync-config-data\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.539827 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-scripts\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550230 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-run-httpd\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550282 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-scripts\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc 
kubenswrapper[4991]: I1206 01:22:15.550318 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-scripts\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550370 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-log-httpd\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550403 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-combined-ca-bundle\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550431 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgf9h\" (UniqueName: \"kubernetes.io/projected/41dee50a-eb94-4447-b27a-9480e91a10e1-kube-api-access-rgf9h\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550457 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41dee50a-eb94-4447-b27a-9480e91a10e1-horizon-secret-key\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550482 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-config-data\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550506 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfw6n\" (UniqueName: \"kubernetes.io/projected/41629f2e-563a-4e76-8ac2-530e8d204578-kube-api-access-lfw6n\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550583 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7pcc\" (UniqueName: \"kubernetes.io/projected/62fc1d5f-d757-46ae-bea1-2a68ae61a794-kube-api-access-g7pcc\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550624 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-config-data\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550651 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-config-data\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550675 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-config\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.550700 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-combined-ca-bundle\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.551708 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-scripts\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.552211 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-config-data\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.558776 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41dee50a-eb94-4447-b27a-9480e91a10e1-horizon-secret-key\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.564086 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41dee50a-eb94-4447-b27a-9480e91a10e1-logs\") pod \"horizon-59788c8c95-f8d4v\" (UID: 
\"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.577589 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.600785 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgf9h\" (UniqueName: \"kubernetes.io/projected/41dee50a-eb94-4447-b27a-9480e91a10e1-kube-api-access-rgf9h\") pod \"horizon-59788c8c95-f8d4v\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.627850 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.646166 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7sjq8"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.648411 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.651648 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.651857 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-25jds" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.652880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7pcc\" (UniqueName: \"kubernetes.io/projected/62fc1d5f-d757-46ae-bea1-2a68ae61a794-kube-api-access-g7pcc\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.652915 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-config-data\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.652940 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-config\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.652958 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-combined-ca-bundle\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653032 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653063 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6r5j\" (UniqueName: \"kubernetes.io/projected/48107f10-c266-4cf9-b65f-bca87b8cdc86-kube-api-access-z6r5j\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653089 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653119 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41629f2e-563a-4e76-8ac2-530e8d204578-etc-machine-id\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653142 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-db-sync-config-data\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653163 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-scripts\") pod \"cinder-db-sync-znbg5\" 
(UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.653185 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-run-httpd\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.655116 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-scripts\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.655155 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-log-httpd\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.655178 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-combined-ca-bundle\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.655200 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-config-data\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.655229 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-lfw6n\" (UniqueName: \"kubernetes.io/projected/41629f2e-563a-4e76-8ac2-530e8d204578-kube-api-access-lfw6n\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.656396 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41629f2e-563a-4e76-8ac2-530e8d204578-etc-machine-id\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.656839 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-run-httpd\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.657560 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-log-httpd\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.661206 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-config\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.661315 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-7gnlm"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.666036 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.667764 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-db-sync-config-data\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.680391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-scripts\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.680762 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.691536 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-scripts\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.692304 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6r5j\" (UniqueName: \"kubernetes.io/projected/48107f10-c266-4cf9-b65f-bca87b8cdc86-kube-api-access-z6r5j\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc 
kubenswrapper[4991]: I1206 01:22:15.692840 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-config-data\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.693405 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-config-data\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.695962 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfw6n\" (UniqueName: \"kubernetes.io/projected/41629f2e-563a-4e76-8ac2-530e8d204578-kube-api-access-lfw6n\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.705233 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-676c488777-8p6b4"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.706865 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.708807 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-combined-ca-bundle\") pod \"cinder-db-sync-znbg5\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.717968 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7sjq8"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.733820 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7pcc\" (UniqueName: \"kubernetes.io/projected/62fc1d5f-d757-46ae-bea1-2a68ae61a794-kube-api-access-g7pcc\") pod \"ceilometer-0\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.745789 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-combined-ca-bundle\") pod \"neutron-db-sync-5hg4t\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.750345 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757257 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-combined-ca-bundle\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757321 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-scripts\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757394 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d2c212d-7062-4b49-948d-ee5ebccf9701-horizon-secret-key\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757488 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-config-data\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757526 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sz6n\" (UniqueName: \"kubernetes.io/projected/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-kube-api-access-5sz6n\") pod \"barbican-db-sync-7sjq8\" (UID: 
\"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757571 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dm6c\" (UniqueName: \"kubernetes.io/projected/8d2c212d-7062-4b49-948d-ee5ebccf9701-kube-api-access-4dm6c\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757593 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d2c212d-7062-4b49-948d-ee5ebccf9701-logs\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.757632 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-db-sync-config-data\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.760613 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-676c488777-8p6b4"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.768682 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-vx96n"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.770217 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.785706 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-669lm"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.801305 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-znbg5" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.810760 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.812109 4991 generic.go:334] "Generic (PLEG): container finished" podID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerID="e028cfba7f3e414ae2c808b5c301de7c336744f3f8cfb27f6230c285427ebd4c" exitCode=0 Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.812171 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" event={"ID":"d73d18a0-25b2-4b77-a62d-b6766b700f3b","Type":"ContainerDied","Data":"e028cfba7f3e414ae2c808b5c301de7c336744f3f8cfb27f6230c285427ebd4c"} Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.816736 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.818470 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xl2dp" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.818605 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.833835 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-vx96n"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.844341 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.848064 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-669lm"] Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.848994 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861418 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7fsq\" (UniqueName: \"kubernetes.io/projected/d73d18a0-25b2-4b77-a62d-b6766b700f3b-kube-api-access-b7fsq\") pod \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861465 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-svc\") pod \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861559 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-config\") pod \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861640 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-swift-storage-0\") pod \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861690 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-sb\") pod \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861715 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-nb\") pod \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\" (UID: \"d73d18a0-25b2-4b77-a62d-b6766b700f3b\") " Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861878 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-config\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861906 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-combined-ca-bundle\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861932 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-scripts\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861957 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-combined-ca-bundle\") pod \"placement-db-sync-669lm\" (UID: 
\"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.861976 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862009 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862030 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862052 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d2c212d-7062-4b49-948d-ee5ebccf9701-horizon-secret-key\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862076 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-config-data\") pod \"placement-db-sync-669lm\" 
(UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862093 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjf6\" (UniqueName: \"kubernetes.io/projected/dbd0cb1a-1401-4031-b980-ef43103702a3-kube-api-access-wnjf6\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862127 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-config-data\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862152 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sz6n\" (UniqueName: \"kubernetes.io/projected/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-kube-api-access-5sz6n\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862179 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da8e85-cc82-46af-9370-419a710b2a8d-logs\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862200 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-scripts\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " 
pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862222 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dm6c\" (UniqueName: \"kubernetes.io/projected/8d2c212d-7062-4b49-948d-ee5ebccf9701-kube-api-access-4dm6c\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862242 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d2c212d-7062-4b49-948d-ee5ebccf9701-logs\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862269 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862288 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-db-sync-config-data\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.862309 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gld2z\" (UniqueName: \"kubernetes.io/projected/01da8e85-cc82-46af-9370-419a710b2a8d-kube-api-access-gld2z\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 
crc kubenswrapper[4991]: I1206 01:22:15.865340 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d2c212d-7062-4b49-948d-ee5ebccf9701-logs\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.866220 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-scripts\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.866376 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-config-data\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.868172 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d2c212d-7062-4b49-948d-ee5ebccf9701-horizon-secret-key\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.880870 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73d18a0-25b2-4b77-a62d-b6766b700f3b-kube-api-access-b7fsq" (OuterVolumeSpecName: "kube-api-access-b7fsq") pod "d73d18a0-25b2-4b77-a62d-b6766b700f3b" (UID: "d73d18a0-25b2-4b77-a62d-b6766b700f3b"). InnerVolumeSpecName "kube-api-access-b7fsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.887611 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-db-sync-config-data\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.888942 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-combined-ca-bundle\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.897635 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dm6c\" (UniqueName: \"kubernetes.io/projected/8d2c212d-7062-4b49-948d-ee5ebccf9701-kube-api-access-4dm6c\") pod \"horizon-676c488777-8p6b4\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.911748 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sz6n\" (UniqueName: \"kubernetes.io/projected/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-kube-api-access-5sz6n\") pod \"barbican-db-sync-7sjq8\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963117 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da8e85-cc82-46af-9370-419a710b2a8d-logs\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963179 
4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-scripts\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963221 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963252 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gld2z\" (UniqueName: \"kubernetes.io/projected/01da8e85-cc82-46af-9370-419a710b2a8d-kube-api-access-gld2z\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963294 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-config\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963338 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-combined-ca-bundle\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963360 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963374 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963393 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963422 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-config-data\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963437 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjf6\" (UniqueName: \"kubernetes.io/projected/dbd0cb1a-1401-4031-b980-ef43103702a3-kube-api-access-wnjf6\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.963496 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7fsq\" (UniqueName: 
\"kubernetes.io/projected/d73d18a0-25b2-4b77-a62d-b6766b700f3b-kube-api-access-b7fsq\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.964477 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da8e85-cc82-46af-9370-419a710b2a8d-logs\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.966403 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.967291 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-config\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.967523 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.968948 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc 
kubenswrapper[4991]: I1206 01:22:15.969692 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.971764 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.980345 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.985165 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-config-data\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.991270 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjf6\" (UniqueName: \"kubernetes.io/projected/dbd0cb1a-1401-4031-b980-ef43103702a3-kube-api-access-wnjf6\") pod \"dnsmasq-dns-fcfdd6f9f-vx96n\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.991384 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-scripts\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:15 crc kubenswrapper[4991]: I1206 01:22:15.991569 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-combined-ca-bundle\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.002480 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gld2z\" (UniqueName: \"kubernetes.io/projected/01da8e85-cc82-46af-9370-419a710b2a8d-kube-api-access-gld2z\") pod \"placement-db-sync-669lm\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " pod="openstack/placement-db-sync-669lm" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.047685 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.080659 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d73d18a0-25b2-4b77-a62d-b6766b700f3b" (UID: "d73d18a0-25b2-4b77-a62d-b6766b700f3b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.095396 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d73d18a0-25b2-4b77-a62d-b6766b700f3b" (UID: "d73d18a0-25b2-4b77-a62d-b6766b700f3b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.109416 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-config" (OuterVolumeSpecName: "config") pod "d73d18a0-25b2-4b77-a62d-b6766b700f3b" (UID: "d73d18a0-25b2-4b77-a62d-b6766b700f3b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.113932 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.117549 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d73d18a0-25b2-4b77-a62d-b6766b700f3b" (UID: "d73d18a0-25b2-4b77-a62d-b6766b700f3b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.131192 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d73d18a0-25b2-4b77-a62d-b6766b700f3b" (UID: "d73d18a0-25b2-4b77-a62d-b6766b700f3b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.141340 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-669lm" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.166419 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.166787 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.166797 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.166809 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.166818 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73d18a0-25b2-4b77-a62d-b6766b700f3b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.190508 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zdfxt"] Dec 06 01:22:16 crc kubenswrapper[4991]: W1206 01:22:16.263110 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b098106_2ef2_4f7a_adad_6b9b3349d846.slice/crio-da4646b7a2c88c5f9669aec83c0359f25e5853516f44c46b42b81ab183985d5c WatchSource:0}: Error finding container da4646b7a2c88c5f9669aec83c0359f25e5853516f44c46b42b81ab183985d5c: Status 404 returned error can't find 
the container with id da4646b7a2c88c5f9669aec83c0359f25e5853516f44c46b42b81ab183985d5c Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.331055 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-7gnlm"] Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.501821 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59788c8c95-f8d4v"] Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.507275 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5hg4t"] Dec 06 01:22:16 crc kubenswrapper[4991]: W1206 01:22:16.512759 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48107f10_c266_4cf9_b65f_bca87b8cdc86.slice/crio-6d16d8ab377f9b32f561bc66edcb3e0e9b4a729b25d1fe2ba217cfc40323e1f8 WatchSource:0}: Error finding container 6d16d8ab377f9b32f561bc66edcb3e0e9b4a729b25d1fe2ba217cfc40323e1f8: Status 404 returned error can't find the container with id 6d16d8ab377f9b32f561bc66edcb3e0e9b4a729b25d1fe2ba217cfc40323e1f8 Dec 06 01:22:16 crc kubenswrapper[4991]: W1206 01:22:16.517815 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41dee50a_eb94_4447_b27a_9480e91a10e1.slice/crio-02a37bf99d541af405ec5f17b2d3785b9082b685c3a73f51e4e453e3e6d73f9c WatchSource:0}: Error finding container 02a37bf99d541af405ec5f17b2d3785b9082b685c3a73f51e4e453e3e6d73f9c: Status 404 returned error can't find the container with id 02a37bf99d541af405ec5f17b2d3785b9082b685c3a73f51e4e453e3e6d73f9c Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.722518 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-znbg5"] Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.739680 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.739753 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:22:16 crc kubenswrapper[4991]: W1206 01:22:16.749463 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41629f2e_563a_4e76_8ac2_530e8d204578.slice/crio-3ade1b923629e9a6e33edd628f00e4382beae678cf0d08c7ada1f262e685f674 WatchSource:0}: Error finding container 3ade1b923629e9a6e33edd628f00e4382beae678cf0d08c7ada1f262e685f674: Status 404 returned error can't find the container with id 3ade1b923629e9a6e33edd628f00e4382beae678cf0d08c7ada1f262e685f674 Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.826258 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5hg4t" event={"ID":"48107f10-c266-4cf9-b65f-bca87b8cdc86","Type":"ContainerStarted","Data":"54571095ffb1dce420e52b47fa807af289281627c618d64453b102d3d8876e94"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.826841 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5hg4t" event={"ID":"48107f10-c266-4cf9-b65f-bca87b8cdc86","Type":"ContainerStarted","Data":"6d16d8ab377f9b32f561bc66edcb3e0e9b4a729b25d1fe2ba217cfc40323e1f8"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.830578 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rssqk" 
event={"ID":"1befea2d-2e39-4901-98ef-da500dcb82f9","Type":"ContainerStarted","Data":"53ca5ffaa63419d3c128994a5584380b438b4f851235cc4a9fef77cc891d9f7f"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.837885 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.841469 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zdfxt" event={"ID":"4b098106-2ef2-4f7a-adad-6b9b3349d846","Type":"ContainerStarted","Data":"56968dca98b9f302ced297b2fdc62f37175ba969001325560b448e6f9b944ad6"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.841498 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zdfxt" event={"ID":"4b098106-2ef2-4f7a-adad-6b9b3349d846","Type":"ContainerStarted","Data":"da4646b7a2c88c5f9669aec83c0359f25e5853516f44c46b42b81ab183985d5c"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.852814 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7sjq8"] Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.858275 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znbg5" event={"ID":"41629f2e-563a-4e76-8ac2-530e8d204578","Type":"ContainerStarted","Data":"3ade1b923629e9a6e33edd628f00e4382beae678cf0d08c7ada1f262e685f674"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.860115 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59788c8c95-f8d4v" event={"ID":"41dee50a-eb94-4447-b27a-9480e91a10e1","Type":"ContainerStarted","Data":"02a37bf99d541af405ec5f17b2d3785b9082b685c3a73f51e4e453e3e6d73f9c"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.883489 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" 
event={"ID":"c8d73ecd-6cba-4db9-b60e-e8613f2ee903","Type":"ContainerStarted","Data":"9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.892185 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" event={"ID":"c8d73ecd-6cba-4db9-b60e-e8613f2ee903","Type":"ContainerStarted","Data":"30c6cd7054e90a165266a7f9111eb73a71eb8b98b8b6f3f49df65dbd28ebfdf1"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.887771 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-5hg4t" podStartSLOduration=1.887741036 podStartE2EDuration="1.887741036s" podCreationTimestamp="2025-12-06 01:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:16.862103265 +0000 UTC m=+1210.212393753" watchObservedRunningTime="2025-12-06 01:22:16.887741036 +0000 UTC m=+1210.238031504" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.904976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" event={"ID":"d73d18a0-25b2-4b77-a62d-b6766b700f3b","Type":"ContainerDied","Data":"cc8aaae54f96ff7fb89e833fb967b2f7ac0da27a17eb86686caad2f92eaf85c9"} Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.905140 4991 scope.go:117] "RemoveContainer" containerID="e028cfba7f3e414ae2c808b5c301de7c336744f3f8cfb27f6230c285427ebd4c" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.905680 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-lnjwh" Dec 06 01:22:16 crc kubenswrapper[4991]: I1206 01:22:16.920967 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rssqk" podStartSLOduration=3.635422026 podStartE2EDuration="34.920869723s" podCreationTimestamp="2025-12-06 01:21:42 +0000 UTC" firstStartedPulling="2025-12-06 01:21:43.31866628 +0000 UTC m=+1176.668956758" lastFinishedPulling="2025-12-06 01:22:14.604113987 +0000 UTC m=+1207.954404455" observedRunningTime="2025-12-06 01:22:16.89440254 +0000 UTC m=+1210.244693018" watchObservedRunningTime="2025-12-06 01:22:16.920869723 +0000 UTC m=+1210.271160191" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.001470 4991 scope.go:117] "RemoveContainer" containerID="eeb10dcd78cd2d90546d566f0e0b821475691080a4e0457fe3cd11aec25b4dee" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.042216 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zdfxt" podStartSLOduration=2.04218666 podStartE2EDuration="2.04218666s" podCreationTimestamp="2025-12-06 01:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:16.91614939 +0000 UTC m=+1210.266439858" watchObservedRunningTime="2025-12-06 01:22:17.04218666 +0000 UTC m=+1210.392477128" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.149077 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-lnjwh"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.202328 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-lnjwh"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.232158 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-676c488777-8p6b4"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.291278 4991 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-669lm"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.324403 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-vx96n"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.472407 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59788c8c95-f8d4v"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.554445 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b44fc996f-ntcqm"] Dec 06 01:22:17 crc kubenswrapper[4991]: E1206 01:22:17.555074 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="init" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.555090 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="init" Dec 06 01:22:17 crc kubenswrapper[4991]: E1206 01:22:17.555114 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="dnsmasq-dns" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.555125 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="dnsmasq-dns" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.555362 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" containerName="dnsmasq-dns" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.556641 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.575681 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b44fc996f-ntcqm"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.624395 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.691113 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.735167 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-config-data\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.735217 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860826f9-f426-4670-be89-0ae09e917d65-logs\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.735267 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/860826f9-f426-4670-be89-0ae09e917d65-horizon-secret-key\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.735314 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgjqs\" (UniqueName: 
\"kubernetes.io/projected/860826f9-f426-4670-be89-0ae09e917d65-kube-api-access-zgjqs\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.735383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-scripts\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.835926 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-svc\") pod \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836011 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp8tc\" (UniqueName: \"kubernetes.io/projected/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-kube-api-access-tp8tc\") pod \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836166 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-nb\") pod \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836260 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-sb\") pod \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\" (UID: 
\"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836281 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-config\") pod \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836360 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-swift-storage-0\") pod \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\" (UID: \"c8d73ecd-6cba-4db9-b60e-e8613f2ee903\") " Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836580 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-scripts\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836627 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-config-data\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.836649 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860826f9-f426-4670-be89-0ae09e917d65-logs\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.837126 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/860826f9-f426-4670-be89-0ae09e917d65-horizon-secret-key\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.837170 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjqs\" (UniqueName: \"kubernetes.io/projected/860826f9-f426-4670-be89-0ae09e917d65-kube-api-access-zgjqs\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.839673 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-scripts\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.840113 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860826f9-f426-4670-be89-0ae09e917d65-logs\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.841556 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-config-data\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.843340 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-kube-api-access-tp8tc" (OuterVolumeSpecName: "kube-api-access-tp8tc") pod 
"c8d73ecd-6cba-4db9-b60e-e8613f2ee903" (UID: "c8d73ecd-6cba-4db9-b60e-e8613f2ee903"). InnerVolumeSpecName "kube-api-access-tp8tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.863581 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/860826f9-f426-4670-be89-0ae09e917d65-horizon-secret-key\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.883198 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgjqs\" (UniqueName: \"kubernetes.io/projected/860826f9-f426-4670-be89-0ae09e917d65-kube-api-access-zgjqs\") pod \"horizon-5b44fc996f-ntcqm\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.897147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8d73ecd-6cba-4db9-b60e-e8613f2ee903" (UID: "c8d73ecd-6cba-4db9-b60e-e8613f2ee903"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.906919 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8d73ecd-6cba-4db9-b60e-e8613f2ee903" (UID: "c8d73ecd-6cba-4db9-b60e-e8613f2ee903"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.923630 4991 generic.go:334] "Generic (PLEG): container finished" podID="c8d73ecd-6cba-4db9-b60e-e8613f2ee903" containerID="9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86" exitCode=0 Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.924579 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" event={"ID":"c8d73ecd-6cba-4db9-b60e-e8613f2ee903","Type":"ContainerDied","Data":"9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.930525 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" event={"ID":"c8d73ecd-6cba-4db9-b60e-e8613f2ee903","Type":"ContainerDied","Data":"30c6cd7054e90a165266a7f9111eb73a71eb8b98b8b6f3f49df65dbd28ebfdf1"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.930586 4991 scope.go:117] "RemoveContainer" containerID="9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.931001 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8c45789f-7gnlm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.936874 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sjq8" event={"ID":"1f102ba4-81e9-4d61-b5a7-90126ab9dd80","Type":"ContainerStarted","Data":"b5284fb9c382af08233d541cce1c61397b12a6ccd2cbc14469ff8f1064cb392d"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.939412 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.939435 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.939447 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp8tc\" (UniqueName: \"kubernetes.io/projected/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-kube-api-access-tp8tc\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.943398 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerStarted","Data":"07382850d344806885e3e3fe0fa323ed1da45cb7fd0f390e2e8ab1fb6912c0e5"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.946721 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.947899 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-config" (OuterVolumeSpecName: "config") pod "c8d73ecd-6cba-4db9-b60e-e8613f2ee903" (UID: "c8d73ecd-6cba-4db9-b60e-e8613f2ee903"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.953250 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-676c488777-8p6b4" event={"ID":"8d2c212d-7062-4b49-948d-ee5ebccf9701","Type":"ContainerStarted","Data":"8b1501d6ecc916adfe7db0d6aa36c050c3302e1c889090db572844054c8b5dae"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.957544 4991 generic.go:334] "Generic (PLEG): container finished" podID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerID="b98e9f30a746c9a4915cf9a938394e3fb83ee03190d0c733709a89aca75599c2" exitCode=0 Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.957610 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" event={"ID":"dbd0cb1a-1401-4031-b980-ef43103702a3","Type":"ContainerDied","Data":"b98e9f30a746c9a4915cf9a938394e3fb83ee03190d0c733709a89aca75599c2"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.957641 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" event={"ID":"dbd0cb1a-1401-4031-b980-ef43103702a3","Type":"ContainerStarted","Data":"7f972dbb7405c1b3ef4ece8f0b690c43e3eaaa07d80583640deebd6aa787acfa"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.961781 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-669lm" event={"ID":"01da8e85-cc82-46af-9370-419a710b2a8d","Type":"ContainerStarted","Data":"37e5f2f512e37b70fa5a33e5507bd1fdb5abdd9ec0f6282946c2551e6c0715c2"} Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.979764 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8d73ecd-6cba-4db9-b60e-e8613f2ee903" (UID: "c8d73ecd-6cba-4db9-b60e-e8613f2ee903"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.982497 4991 scope.go:117] "RemoveContainer" containerID="9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86" Dec 06 01:22:17 crc kubenswrapper[4991]: E1206 01:22:17.982919 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86\": container with ID starting with 9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86 not found: ID does not exist" containerID="9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86" Dec 06 01:22:17 crc kubenswrapper[4991]: I1206 01:22:17.982945 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86"} err="failed to get container status \"9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86\": rpc error: code = NotFound desc = could not find container \"9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86\": container with ID starting with 9876dbcea9b33d867922019678cd2f942388aa853ff13839b24d83af7d0cfa86 not found: ID does not exist" Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.004961 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c8d73ecd-6cba-4db9-b60e-e8613f2ee903" (UID: "c8d73ecd-6cba-4db9-b60e-e8613f2ee903"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.041911 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.041953 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.041968 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d73ecd-6cba-4db9-b60e-e8613f2ee903-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.343794 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-7gnlm"] Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.366323 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8c45789f-7gnlm"] Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.501635 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b44fc996f-ntcqm"] Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.979439 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b44fc996f-ntcqm" event={"ID":"860826f9-f426-4670-be89-0ae09e917d65","Type":"ContainerStarted","Data":"fef4336a4fed8478aa64e1f031e811544433cc8cd84bd579b11d9973b2e7de2d"} Dec 06 01:22:18 crc kubenswrapper[4991]: I1206 01:22:18.997856 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" event={"ID":"dbd0cb1a-1401-4031-b980-ef43103702a3","Type":"ContainerStarted","Data":"9c934d2d7ee53dbcc0b8a8a439ed753250da5345430007a106d20855b1af3ec1"} Dec 06 01:22:19 crc 
kubenswrapper[4991]: I1206 01:22:19.017443 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" podStartSLOduration=4.017419346 podStartE2EDuration="4.017419346s" podCreationTimestamp="2025-12-06 01:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:19.016209874 +0000 UTC m=+1212.366500352" watchObservedRunningTime="2025-12-06 01:22:19.017419346 +0000 UTC m=+1212.367709814" Dec 06 01:22:19 crc kubenswrapper[4991]: I1206 01:22:19.057725 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d73ecd-6cba-4db9-b60e-e8613f2ee903" path="/var/lib/kubelet/pods/c8d73ecd-6cba-4db9-b60e-e8613f2ee903/volumes" Dec 06 01:22:19 crc kubenswrapper[4991]: I1206 01:22:19.058278 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73d18a0-25b2-4b77-a62d-b6766b700f3b" path="/var/lib/kubelet/pods/d73d18a0-25b2-4b77-a62d-b6766b700f3b/volumes" Dec 06 01:22:20 crc kubenswrapper[4991]: I1206 01:22:20.029571 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:23 crc kubenswrapper[4991]: I1206 01:22:23.104081 4991 generic.go:334] "Generic (PLEG): container finished" podID="4b098106-2ef2-4f7a-adad-6b9b3349d846" containerID="56968dca98b9f302ced297b2fdc62f37175ba969001325560b448e6f9b944ad6" exitCode=0 Dec 06 01:22:23 crc kubenswrapper[4991]: I1206 01:22:23.104625 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zdfxt" event={"ID":"4b098106-2ef2-4f7a-adad-6b9b3349d846","Type":"ContainerDied","Data":"56968dca98b9f302ced297b2fdc62f37175ba969001325560b448e6f9b944ad6"} Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.103229 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-676c488777-8p6b4"] Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 
01:22:24.137426 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b4cd4b7d8-99kjr"] Dec 06 01:22:24 crc kubenswrapper[4991]: E1206 01:22:24.137833 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d73ecd-6cba-4db9-b60e-e8613f2ee903" containerName="init" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.137846 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d73ecd-6cba-4db9-b60e-e8613f2ee903" containerName="init" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.138021 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d73ecd-6cba-4db9-b60e-e8613f2ee903" containerName="init" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.139009 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.143545 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.168551 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b4cd4b7d8-99kjr"] Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.197687 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6a4088-3509-40aa-a3f6-917e289cd44f-logs\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.197784 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-config-data\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 
01:22:24.198044 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvl2d\" (UniqueName: \"kubernetes.io/projected/ff6a4088-3509-40aa-a3f6-917e289cd44f-kube-api-access-xvl2d\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.198840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-scripts\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.198894 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-tls-certs\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.198974 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-secret-key\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.199048 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-combined-ca-bundle\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: 
I1206 01:22:24.252215 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b44fc996f-ntcqm"] Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301204 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6a4088-3509-40aa-a3f6-917e289cd44f-logs\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301292 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-config-data\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301335 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvl2d\" (UniqueName: \"kubernetes.io/projected/ff6a4088-3509-40aa-a3f6-917e289cd44f-kube-api-access-xvl2d\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301425 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-scripts\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301453 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-tls-certs\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 
01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301486 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-secret-key\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.301515 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-combined-ca-bundle\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.305045 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-config-data\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.312468 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6a4088-3509-40aa-a3f6-917e289cd44f-logs\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.314771 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-scripts\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.320875 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-secret-key\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.323622 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-combined-ca-bundle\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.325394 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-tls-certs\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.334961 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvl2d\" (UniqueName: \"kubernetes.io/projected/ff6a4088-3509-40aa-a3f6-917e289cd44f-kube-api-access-xvl2d\") pod \"horizon-b4cd4b7d8-99kjr\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.348841 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-577fd657dd-psmdm"] Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.359817 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.389078 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-577fd657dd-psmdm"] Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.402932 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15de5638-5e2b-4ce1-9d25-8f2830cd9241-logs\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.403005 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-horizon-tls-certs\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.403069 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15de5638-5e2b-4ce1-9d25-8f2830cd9241-scripts\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.403115 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v5kv\" (UniqueName: \"kubernetes.io/projected/15de5638-5e2b-4ce1-9d25-8f2830cd9241-kube-api-access-6v5kv\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.403138 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/15de5638-5e2b-4ce1-9d25-8f2830cd9241-config-data\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.403163 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-combined-ca-bundle\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.403210 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-horizon-secret-key\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.485895 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.506235 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15de5638-5e2b-4ce1-9d25-8f2830cd9241-scripts\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.506299 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v5kv\" (UniqueName: \"kubernetes.io/projected/15de5638-5e2b-4ce1-9d25-8f2830cd9241-kube-api-access-6v5kv\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.506330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15de5638-5e2b-4ce1-9d25-8f2830cd9241-config-data\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.506361 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-combined-ca-bundle\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.506416 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-horizon-secret-key\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc 
kubenswrapper[4991]: I1206 01:22:24.506473 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15de5638-5e2b-4ce1-9d25-8f2830cd9241-logs\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.506495 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-horizon-tls-certs\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.507541 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15de5638-5e2b-4ce1-9d25-8f2830cd9241-scripts\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.508228 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15de5638-5e2b-4ce1-9d25-8f2830cd9241-logs\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.508340 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15de5638-5e2b-4ce1-9d25-8f2830cd9241-config-data\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.511890 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-horizon-secret-key\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.512301 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-horizon-tls-certs\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.512778 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15de5638-5e2b-4ce1-9d25-8f2830cd9241-combined-ca-bundle\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.540139 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v5kv\" (UniqueName: \"kubernetes.io/projected/15de5638-5e2b-4ce1-9d25-8f2830cd9241-kube-api-access-6v5kv\") pod \"horizon-577fd657dd-psmdm\" (UID: \"15de5638-5e2b-4ce1-9d25-8f2830cd9241\") " pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:24 crc kubenswrapper[4991]: I1206 01:22:24.745679 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:26 crc kubenswrapper[4991]: I1206 01:22:26.116150 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:26 crc kubenswrapper[4991]: I1206 01:22:26.193038 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2wbr6"] Dec 06 01:22:26 crc kubenswrapper[4991]: I1206 01:22:26.193420 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" containerID="cri-o://4446c7bbe395dc5b4d0b88f1e738a6013f2ff5fd7470461981530e62075d68d1" gracePeriod=10 Dec 06 01:22:27 crc kubenswrapper[4991]: I1206 01:22:27.180619 4991 generic.go:334] "Generic (PLEG): container finished" podID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerID="4446c7bbe395dc5b4d0b88f1e738a6013f2ff5fd7470461981530e62075d68d1" exitCode=0 Dec 06 01:22:27 crc kubenswrapper[4991]: I1206 01:22:27.180722 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" event={"ID":"257d40ce-9d2a-4cc4-9e89-4cb777582c47","Type":"ContainerDied","Data":"4446c7bbe395dc5b4d0b88f1e738a6013f2ff5fd7470461981530e62075d68d1"} Dec 06 01:22:29 crc kubenswrapper[4991]: I1206 01:22:29.476974 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 06 01:22:32 crc kubenswrapper[4991]: I1206 01:22:32.237235 4991 generic.go:334] "Generic (PLEG): container finished" podID="1befea2d-2e39-4901-98ef-da500dcb82f9" containerID="53ca5ffaa63419d3c128994a5584380b438b4f851235cc4a9fef77cc891d9f7f" exitCode=0 Dec 06 01:22:32 crc kubenswrapper[4991]: I1206 01:22:32.237733 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rssqk" event={"ID":"1befea2d-2e39-4901-98ef-da500dcb82f9","Type":"ContainerDied","Data":"53ca5ffaa63419d3c128994a5584380b438b4f851235cc4a9fef77cc891d9f7f"} Dec 06 01:22:32 crc kubenswrapper[4991]: E1206 01:22:32.598467 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 06 01:22:32 crc kubenswrapper[4991]: E1206 01:22:32.598690 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n98hf8h647h699h5d5h55chf8h7h66dhbch5fch5dch76h64dhb7h659hc6h58ch6bhc7h574h55bh587h569h8dh576h75h659h64h569hb4h5cbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgf9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPo
licy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-59788c8c95-f8d4v_openstack(41dee50a-eb94-4447-b27a-9480e91a10e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:22:32 crc kubenswrapper[4991]: E1206 01:22:32.601276 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-59788c8c95-f8d4v" podUID="41dee50a-eb94-4447-b27a-9480e91a10e1" Dec 06 01:22:32 crc kubenswrapper[4991]: E1206 01:22:32.617628 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 06 01:22:32 crc kubenswrapper[4991]: E1206 01:22:32.617846 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n546h57h57dh5d4hdchb6h688h5f7h68h64fh688hd9hb5hd5hbh5cfh699h658h688h6bh5dchbbh56chf4h65fh666h687hb9hb4hbbhc5h577q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgjqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5b44fc996f-ntcqm_openstack(860826f9-f426-4670-be89-0ae09e917d65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:22:32 crc kubenswrapper[4991]: E1206 
01:22:32.620588 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5b44fc996f-ntcqm" podUID="860826f9-f426-4670-be89-0ae09e917d65" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.477355 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 06 01:22:34 crc kubenswrapper[4991]: E1206 01:22:34.814479 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 06 01:22:34 crc kubenswrapper[4991]: E1206 01:22:34.814808 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7h8h595h699h69h669h554h549h5dbh548h567h5f8hbh54h68bh59fh8ch55dhc7h57hdch55ch5bbh67h9ch5d8hb9h5cbh5d4h5bh5f7h9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g7pcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(62fc1d5f-d757-46ae-bea1-2a68ae61a794): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:22:34 crc kubenswrapper[4991]: E1206 01:22:34.845245 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 06 01:22:34 crc kubenswrapper[4991]: E1206 01:22:34.845461 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h56ch696h584h584h5f8h556hb6h678h89hc7h5dh654h9fh9fhd7hfdh74h65fh65bh58fh695h5bfh57ch5c5hch58fh9fh55dh666h57dh548q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dm6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-676c488777-8p6b4_openstack(8d2c212d-7062-4b49-948d-ee5ebccf9701): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:22:34 crc kubenswrapper[4991]: E1206 
01:22:34.847773 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-676c488777-8p6b4" podUID="8d2c212d-7062-4b49-948d-ee5ebccf9701" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.900839 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.957767 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-fernet-keys\") pod \"4b098106-2ef2-4f7a-adad-6b9b3349d846\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.958217 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-scripts\") pod \"4b098106-2ef2-4f7a-adad-6b9b3349d846\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.958323 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-credential-keys\") pod \"4b098106-2ef2-4f7a-adad-6b9b3349d846\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.958390 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-combined-ca-bundle\") pod 
\"4b098106-2ef2-4f7a-adad-6b9b3349d846\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.958554 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-config-data\") pod \"4b098106-2ef2-4f7a-adad-6b9b3349d846\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.958615 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jbkt\" (UniqueName: \"kubernetes.io/projected/4b098106-2ef2-4f7a-adad-6b9b3349d846-kube-api-access-2jbkt\") pod \"4b098106-2ef2-4f7a-adad-6b9b3349d846\" (UID: \"4b098106-2ef2-4f7a-adad-6b9b3349d846\") " Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.966796 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b098106-2ef2-4f7a-adad-6b9b3349d846" (UID: "4b098106-2ef2-4f7a-adad-6b9b3349d846"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.967865 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b098106-2ef2-4f7a-adad-6b9b3349d846-kube-api-access-2jbkt" (OuterVolumeSpecName: "kube-api-access-2jbkt") pod "4b098106-2ef2-4f7a-adad-6b9b3349d846" (UID: "4b098106-2ef2-4f7a-adad-6b9b3349d846"). InnerVolumeSpecName "kube-api-access-2jbkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.968351 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4b098106-2ef2-4f7a-adad-6b9b3349d846" (UID: "4b098106-2ef2-4f7a-adad-6b9b3349d846"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.968563 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-scripts" (OuterVolumeSpecName: "scripts") pod "4b098106-2ef2-4f7a-adad-6b9b3349d846" (UID: "4b098106-2ef2-4f7a-adad-6b9b3349d846"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.992540 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-config-data" (OuterVolumeSpecName: "config-data") pod "4b098106-2ef2-4f7a-adad-6b9b3349d846" (UID: "4b098106-2ef2-4f7a-adad-6b9b3349d846"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:34 crc kubenswrapper[4991]: I1206 01:22:34.997966 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b098106-2ef2-4f7a-adad-6b9b3349d846" (UID: "4b098106-2ef2-4f7a-adad-6b9b3349d846"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.062344 4991 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.062399 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.062419 4991 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.062441 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.062460 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b098106-2ef2-4f7a-adad-6b9b3349d846-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.062479 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jbkt\" (UniqueName: \"kubernetes.io/projected/4b098106-2ef2-4f7a-adad-6b9b3349d846-kube-api-access-2jbkt\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.277859 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zdfxt" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.278071 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zdfxt" event={"ID":"4b098106-2ef2-4f7a-adad-6b9b3349d846","Type":"ContainerDied","Data":"da4646b7a2c88c5f9669aec83c0359f25e5853516f44c46b42b81ab183985d5c"} Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.278211 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4646b7a2c88c5f9669aec83c0359f25e5853516f44c46b42b81ab183985d5c" Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.984423 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zdfxt"] Dec 06 01:22:35 crc kubenswrapper[4991]: I1206 01:22:35.992852 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zdfxt"] Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.088210 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zmlwh"] Dec 06 01:22:36 crc kubenswrapper[4991]: E1206 01:22:36.088632 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b098106-2ef2-4f7a-adad-6b9b3349d846" containerName="keystone-bootstrap" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.088649 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b098106-2ef2-4f7a-adad-6b9b3349d846" containerName="keystone-bootstrap" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.088846 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b098106-2ef2-4f7a-adad-6b9b3349d846" containerName="keystone-bootstrap" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.089362 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.093741 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.095249 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.095630 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.095790 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.096431 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z245k" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.106605 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zmlwh"] Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.195123 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zps6\" (UniqueName: \"kubernetes.io/projected/ae198f8f-705c-406d-af09-387de07cb200-kube-api-access-7zps6\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.195213 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-credential-keys\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.195383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-scripts\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.195589 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.195637 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-config-data\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.195656 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-fernet-keys\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.288257 4991 generic.go:334] "Generic (PLEG): container finished" podID="48107f10-c266-4cf9-b65f-bca87b8cdc86" containerID="54571095ffb1dce420e52b47fa807af289281627c618d64453b102d3d8876e94" exitCode=0 Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.288334 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5hg4t" event={"ID":"48107f10-c266-4cf9-b65f-bca87b8cdc86","Type":"ContainerDied","Data":"54571095ffb1dce420e52b47fa807af289281627c618d64453b102d3d8876e94"} Dec 06 
01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.297028 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-scripts\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.297132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.297169 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-config-data\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.297190 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-fernet-keys\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.297257 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zps6\" (UniqueName: \"kubernetes.io/projected/ae198f8f-705c-406d-af09-387de07cb200-kube-api-access-7zps6\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.297300 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-credential-keys\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.304409 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-scripts\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.305550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-credential-keys\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.305626 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-fernet-keys\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.309514 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-config-data\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.309898 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle\") pod 
\"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.314442 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zps6\" (UniqueName: \"kubernetes.io/projected/ae198f8f-705c-406d-af09-387de07cb200-kube-api-access-7zps6\") pod \"keystone-bootstrap-zmlwh\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:36 crc kubenswrapper[4991]: I1206 01:22:36.418344 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:37 crc kubenswrapper[4991]: I1206 01:22:37.052613 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b098106-2ef2-4f7a-adad-6b9b3349d846" path="/var/lib/kubelet/pods/4b098106-2ef2-4f7a-adad-6b9b3349d846/volumes" Dec 06 01:22:39 crc kubenswrapper[4991]: I1206 01:22:39.476923 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 06 01:22:39 crc kubenswrapper[4991]: I1206 01:22:39.477530 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:22:44 crc kubenswrapper[4991]: I1206 01:22:44.477224 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 06 01:22:45 crc kubenswrapper[4991]: E1206 01:22:45.328531 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 06 01:22:45 crc kubenswrapper[4991]: E1206 01:22:45.330405 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5sz6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-7sjq8_openstack(1f102ba4-81e9-4d61-b5a7-90126ab9dd80): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Dec 06 01:22:45 crc kubenswrapper[4991]: E1206 01:22:45.333815 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-7sjq8" podUID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.368225 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59788c8c95-f8d4v" event={"ID":"41dee50a-eb94-4447-b27a-9480e91a10e1","Type":"ContainerDied","Data":"02a37bf99d541af405ec5f17b2d3785b9082b685c3a73f51e4e453e3e6d73f9c"} Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.368282 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02a37bf99d541af405ec5f17b2d3785b9082b685c3a73f51e4e453e3e6d73f9c" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.369292 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5hg4t" event={"ID":"48107f10-c266-4cf9-b65f-bca87b8cdc86","Type":"ContainerDied","Data":"6d16d8ab377f9b32f561bc66edcb3e0e9b4a729b25d1fe2ba217cfc40323e1f8"} Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.369316 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d16d8ab377f9b32f561bc66edcb3e0e9b4a729b25d1fe2ba217cfc40323e1f8" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.370617 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rssqk" event={"ID":"1befea2d-2e39-4901-98ef-da500dcb82f9","Type":"ContainerDied","Data":"28d10f495de102a4cf1980ed911194090f411c05e1ab318930f4a5f09f7989ac"} Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.370659 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d10f495de102a4cf1980ed911194090f411c05e1ab318930f4a5f09f7989ac" Dec 06 01:22:45 crc 
kubenswrapper[4991]: I1206 01:22:45.376006 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-676c488777-8p6b4" event={"ID":"8d2c212d-7062-4b49-948d-ee5ebccf9701","Type":"ContainerDied","Data":"8b1501d6ecc916adfe7db0d6aa36c050c3302e1c889090db572844054c8b5dae"} Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.376034 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1501d6ecc916adfe7db0d6aa36c050c3302e1c889090db572844054c8b5dae" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.379861 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b44fc996f-ntcqm" event={"ID":"860826f9-f426-4670-be89-0ae09e917d65","Type":"ContainerDied","Data":"fef4336a4fed8478aa64e1f031e811544433cc8cd84bd579b11d9973b2e7de2d"} Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.379907 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fef4336a4fed8478aa64e1f031e811544433cc8cd84bd579b11d9973b2e7de2d" Dec 06 01:22:45 crc kubenswrapper[4991]: E1206 01:22:45.380914 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-7sjq8" podUID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.447447 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rssqk" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.449248 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.455900 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.464108 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.494969 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41dee50a-eb94-4447-b27a-9480e91a10e1-horizon-secret-key\") pod \"41dee50a-eb94-4447-b27a-9480e91a10e1\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495060 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7crl2\" (UniqueName: \"kubernetes.io/projected/1befea2d-2e39-4901-98ef-da500dcb82f9-kube-api-access-7crl2\") pod \"1befea2d-2e39-4901-98ef-da500dcb82f9\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495086 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgf9h\" (UniqueName: \"kubernetes.io/projected/41dee50a-eb94-4447-b27a-9480e91a10e1-kube-api-access-rgf9h\") pod \"41dee50a-eb94-4447-b27a-9480e91a10e1\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495204 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-scripts\") pod \"41dee50a-eb94-4447-b27a-9480e91a10e1\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495252 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-combined-ca-bundle\") pod \"1befea2d-2e39-4901-98ef-da500dcb82f9\" 
(UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495301 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41dee50a-eb94-4447-b27a-9480e91a10e1-logs\") pod \"41dee50a-eb94-4447-b27a-9480e91a10e1\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495334 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-config-data\") pod \"1befea2d-2e39-4901-98ef-da500dcb82f9\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495411 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-config-data\") pod \"41dee50a-eb94-4447-b27a-9480e91a10e1\" (UID: \"41dee50a-eb94-4447-b27a-9480e91a10e1\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.495458 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-db-sync-config-data\") pod \"1befea2d-2e39-4901-98ef-da500dcb82f9\" (UID: \"1befea2d-2e39-4901-98ef-da500dcb82f9\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.498003 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-scripts" (OuterVolumeSpecName: "scripts") pod "41dee50a-eb94-4447-b27a-9480e91a10e1" (UID: "41dee50a-eb94-4447-b27a-9480e91a10e1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.498197 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41dee50a-eb94-4447-b27a-9480e91a10e1-logs" (OuterVolumeSpecName: "logs") pod "41dee50a-eb94-4447-b27a-9480e91a10e1" (UID: "41dee50a-eb94-4447-b27a-9480e91a10e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.498351 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-config-data" (OuterVolumeSpecName: "config-data") pod "41dee50a-eb94-4447-b27a-9480e91a10e1" (UID: "41dee50a-eb94-4447-b27a-9480e91a10e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.501065 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.505254 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1befea2d-2e39-4901-98ef-da500dcb82f9" (UID: "1befea2d-2e39-4901-98ef-da500dcb82f9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.506194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1befea2d-2e39-4901-98ef-da500dcb82f9-kube-api-access-7crl2" (OuterVolumeSpecName: "kube-api-access-7crl2") pod "1befea2d-2e39-4901-98ef-da500dcb82f9" (UID: "1befea2d-2e39-4901-98ef-da500dcb82f9"). InnerVolumeSpecName "kube-api-access-7crl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.506595 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dee50a-eb94-4447-b27a-9480e91a10e1-kube-api-access-rgf9h" (OuterVolumeSpecName: "kube-api-access-rgf9h") pod "41dee50a-eb94-4447-b27a-9480e91a10e1" (UID: "41dee50a-eb94-4447-b27a-9480e91a10e1"). InnerVolumeSpecName "kube-api-access-rgf9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.510259 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dee50a-eb94-4447-b27a-9480e91a10e1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "41dee50a-eb94-4447-b27a-9480e91a10e1" (UID: "41dee50a-eb94-4447-b27a-9480e91a10e1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.543203 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1befea2d-2e39-4901-98ef-da500dcb82f9" (UID: "1befea2d-2e39-4901-98ef-da500dcb82f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.596919 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-scripts\") pod \"860826f9-f426-4670-be89-0ae09e917d65\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598102 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860826f9-f426-4670-be89-0ae09e917d65-logs\") pod \"860826f9-f426-4670-be89-0ae09e917d65\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598243 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dm6c\" (UniqueName: \"kubernetes.io/projected/8d2c212d-7062-4b49-948d-ee5ebccf9701-kube-api-access-4dm6c\") pod \"8d2c212d-7062-4b49-948d-ee5ebccf9701\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598273 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-config-data\") pod \"8d2c212d-7062-4b49-948d-ee5ebccf9701\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598291 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgjqs\" (UniqueName: \"kubernetes.io/projected/860826f9-f426-4670-be89-0ae09e917d65-kube-api-access-zgjqs\") pod \"860826f9-f426-4670-be89-0ae09e917d65\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598331 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/860826f9-f426-4670-be89-0ae09e917d65-horizon-secret-key\") pod \"860826f9-f426-4670-be89-0ae09e917d65\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598353 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-config-data\") pod \"860826f9-f426-4670-be89-0ae09e917d65\" (UID: \"860826f9-f426-4670-be89-0ae09e917d65\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598387 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6r5j\" (UniqueName: \"kubernetes.io/projected/48107f10-c266-4cf9-b65f-bca87b8cdc86-kube-api-access-z6r5j\") pod \"48107f10-c266-4cf9-b65f-bca87b8cdc86\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598458 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-config\") pod \"48107f10-c266-4cf9-b65f-bca87b8cdc86\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598486 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d2c212d-7062-4b49-948d-ee5ebccf9701-logs\") pod \"8d2c212d-7062-4b49-948d-ee5ebccf9701\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598532 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d2c212d-7062-4b49-948d-ee5ebccf9701-horizon-secret-key\") pod \"8d2c212d-7062-4b49-948d-ee5ebccf9701\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598562 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-combined-ca-bundle\") pod \"48107f10-c266-4cf9-b65f-bca87b8cdc86\" (UID: \"48107f10-c266-4cf9-b65f-bca87b8cdc86\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598596 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-scripts\") pod \"8d2c212d-7062-4b49-948d-ee5ebccf9701\" (UID: \"8d2c212d-7062-4b49-948d-ee5ebccf9701\") " Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.598768 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-scripts" (OuterVolumeSpecName: "scripts") pod "860826f9-f426-4670-be89-0ae09e917d65" (UID: "860826f9-f426-4670-be89-0ae09e917d65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.599168 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-config-data" (OuterVolumeSpecName: "config-data") pod "860826f9-f426-4670-be89-0ae09e917d65" (UID: "860826f9-f426-4670-be89-0ae09e917d65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.599541 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d2c212d-7062-4b49-948d-ee5ebccf9701-logs" (OuterVolumeSpecName: "logs") pod "8d2c212d-7062-4b49-948d-ee5ebccf9701" (UID: "8d2c212d-7062-4b49-948d-ee5ebccf9701"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.599746 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860826f9-f426-4670-be89-0ae09e917d65-logs" (OuterVolumeSpecName: "logs") pod "860826f9-f426-4670-be89-0ae09e917d65" (UID: "860826f9-f426-4670-be89-0ae09e917d65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600023 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600172 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600294 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860826f9-f426-4670-be89-0ae09e917d65-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600425 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600543 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41dee50a-eb94-4447-b27a-9480e91a10e1-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601583 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/41dee50a-eb94-4447-b27a-9480e91a10e1-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600248 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-scripts" (OuterVolumeSpecName: "scripts") pod "8d2c212d-7062-4b49-948d-ee5ebccf9701" (UID: "8d2c212d-7062-4b49-948d-ee5ebccf9701"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.600630 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-config-data" (OuterVolumeSpecName: "config-data") pod "8d2c212d-7062-4b49-948d-ee5ebccf9701" (UID: "8d2c212d-7062-4b49-948d-ee5ebccf9701"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601701 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/860826f9-f426-4670-be89-0ae09e917d65-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601754 4991 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601769 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d2c212d-7062-4b49-948d-ee5ebccf9701-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601782 4991 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/41dee50a-eb94-4447-b27a-9480e91a10e1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601793 4991 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7crl2\" (UniqueName: \"kubernetes.io/projected/1befea2d-2e39-4901-98ef-da500dcb82f9-kube-api-access-7crl2\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.601805 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgf9h\" (UniqueName: \"kubernetes.io/projected/41dee50a-eb94-4447-b27a-9480e91a10e1-kube-api-access-rgf9h\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.603486 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d2c212d-7062-4b49-948d-ee5ebccf9701-kube-api-access-4dm6c" (OuterVolumeSpecName: "kube-api-access-4dm6c") pod "8d2c212d-7062-4b49-948d-ee5ebccf9701" (UID: "8d2c212d-7062-4b49-948d-ee5ebccf9701"). InnerVolumeSpecName "kube-api-access-4dm6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.605151 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48107f10-c266-4cf9-b65f-bca87b8cdc86-kube-api-access-z6r5j" (OuterVolumeSpecName: "kube-api-access-z6r5j") pod "48107f10-c266-4cf9-b65f-bca87b8cdc86" (UID: "48107f10-c266-4cf9-b65f-bca87b8cdc86"). InnerVolumeSpecName "kube-api-access-z6r5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.605216 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860826f9-f426-4670-be89-0ae09e917d65-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "860826f9-f426-4670-be89-0ae09e917d65" (UID: "860826f9-f426-4670-be89-0ae09e917d65"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.605550 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860826f9-f426-4670-be89-0ae09e917d65-kube-api-access-zgjqs" (OuterVolumeSpecName: "kube-api-access-zgjqs") pod "860826f9-f426-4670-be89-0ae09e917d65" (UID: "860826f9-f426-4670-be89-0ae09e917d65"). InnerVolumeSpecName "kube-api-access-zgjqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.606621 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d2c212d-7062-4b49-948d-ee5ebccf9701-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8d2c212d-7062-4b49-948d-ee5ebccf9701" (UID: "8d2c212d-7062-4b49-948d-ee5ebccf9701"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.606854 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-config-data" (OuterVolumeSpecName: "config-data") pod "1befea2d-2e39-4901-98ef-da500dcb82f9" (UID: "1befea2d-2e39-4901-98ef-da500dcb82f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.636539 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-config" (OuterVolumeSpecName: "config") pod "48107f10-c266-4cf9-b65f-bca87b8cdc86" (UID: "48107f10-c266-4cf9-b65f-bca87b8cdc86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.639237 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48107f10-c266-4cf9-b65f-bca87b8cdc86" (UID: "48107f10-c266-4cf9-b65f-bca87b8cdc86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724153 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1befea2d-2e39-4901-98ef-da500dcb82f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724192 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dm6c\" (UniqueName: \"kubernetes.io/projected/8d2c212d-7062-4b49-948d-ee5ebccf9701-kube-api-access-4dm6c\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724209 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724221 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgjqs\" (UniqueName: \"kubernetes.io/projected/860826f9-f426-4670-be89-0ae09e917d65-kube-api-access-zgjqs\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724232 4991 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/860826f9-f426-4670-be89-0ae09e917d65-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724241 4991 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-z6r5j\" (UniqueName: \"kubernetes.io/projected/48107f10-c266-4cf9-b65f-bca87b8cdc86-kube-api-access-z6r5j\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724250 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724258 4991 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d2c212d-7062-4b49-948d-ee5ebccf9701-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724267 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48107f10-c266-4cf9-b65f-bca87b8cdc86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:45 crc kubenswrapper[4991]: I1206 01:22:45.724278 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d2c212d-7062-4b49-948d-ee5ebccf9701-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.386318 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59788c8c95-f8d4v" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.386355 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-676c488777-8p6b4" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.386331 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rssqk" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.386527 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b44fc996f-ntcqm" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.386318 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5hg4t" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.621599 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-676c488777-8p6b4"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.643309 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-676c488777-8p6b4"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.703950 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59788c8c95-f8d4v"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.722084 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59788c8c95-f8d4v"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.743374 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.743448 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.754645 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b44fc996f-ntcqm"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.764816 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b44fc996f-ntcqm"] Dec 06 01:22:46 crc kubenswrapper[4991]: 
I1206 01:22:46.779211 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6664c6795f-tjg96"] Dec 06 01:22:46 crc kubenswrapper[4991]: E1206 01:22:46.779674 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1befea2d-2e39-4901-98ef-da500dcb82f9" containerName="glance-db-sync" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.779689 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1befea2d-2e39-4901-98ef-da500dcb82f9" containerName="glance-db-sync" Dec 06 01:22:46 crc kubenswrapper[4991]: E1206 01:22:46.779710 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48107f10-c266-4cf9-b65f-bca87b8cdc86" containerName="neutron-db-sync" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.779716 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="48107f10-c266-4cf9-b65f-bca87b8cdc86" containerName="neutron-db-sync" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.779916 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="48107f10-c266-4cf9-b65f-bca87b8cdc86" containerName="neutron-db-sync" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.779962 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1befea2d-2e39-4901-98ef-da500dcb82f9" containerName="glance-db-sync" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.781028 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.789887 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6664c6795f-tjg96"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.878958 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-config\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.879103 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-sb\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.879182 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58f59\" (UniqueName: \"kubernetes.io/projected/1a4881ee-332e-454c-bc1d-b99c932ad458-kube-api-access-58f59\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.879202 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-swift-storage-0\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.879242 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-svc\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.879408 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-nb\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: E1206 01:22:46.887513 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41dee50a_eb94_4447_b27a_9480e91a10e1.slice\": RecentStats: unable to find data in memory cache]" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.968249 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84bc9ddf6-jd669"] Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.971358 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.974736 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.975031 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.975229 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-kdjvx" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.975346 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.985187 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58f59\" (UniqueName: \"kubernetes.io/projected/1a4881ee-332e-454c-bc1d-b99c932ad458-kube-api-access-58f59\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.985222 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-swift-storage-0\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.985251 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-svc\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.985316 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-nb\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.985350 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-config\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.985395 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-sb\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.986251 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-sb\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.986806 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-svc\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:46 crc kubenswrapper[4991]: I1206 01:22:46.993559 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-nb\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.004043 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-config\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.004132 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84bc9ddf6-jd669"] Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.005667 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-swift-storage-0\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.045121 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58f59\" (UniqueName: \"kubernetes.io/projected/1a4881ee-332e-454c-bc1d-b99c932ad458-kube-api-access-58f59\") pod \"dnsmasq-dns-6664c6795f-tjg96\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.070328 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41dee50a-eb94-4447-b27a-9480e91a10e1" path="/var/lib/kubelet/pods/41dee50a-eb94-4447-b27a-9480e91a10e1/volumes" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.070910 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860826f9-f426-4670-be89-0ae09e917d65" 
path="/var/lib/kubelet/pods/860826f9-f426-4670-be89-0ae09e917d65/volumes" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.071469 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d2c212d-7062-4b49-948d-ee5ebccf9701" path="/var/lib/kubelet/pods/8d2c212d-7062-4b49-948d-ee5ebccf9701/volumes" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.087831 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7fd\" (UniqueName: \"kubernetes.io/projected/b305d8ce-5702-42c2-979a-15e0f7e46cae-kube-api-access-9n7fd\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.087883 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-combined-ca-bundle\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.087913 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-httpd-config\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.087954 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-ovndb-tls-certs\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.088264 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-config\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.099757 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6664c6795f-tjg96"] Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.100793 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.120444 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k5mql"] Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.126901 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.189846 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7fd\" (UniqueName: \"kubernetes.io/projected/b305d8ce-5702-42c2-979a-15e0f7e46cae-kube-api-access-9n7fd\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.189910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-combined-ca-bundle\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.189944 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-httpd-config\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.190028 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-ovndb-tls-certs\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.190127 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-config\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.206640 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-ovndb-tls-certs\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.213755 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-httpd-config\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.214409 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k5mql"] Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.239349 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-combined-ca-bundle\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.242064 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-config\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.245013 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7fd\" (UniqueName: \"kubernetes.io/projected/b305d8ce-5702-42c2-979a-15e0f7e46cae-kube-api-access-9n7fd\") pod \"neutron-84bc9ddf6-jd669\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.294130 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.294199 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-config\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.294225 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.294264 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tsrm\" (UniqueName: \"kubernetes.io/projected/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-kube-api-access-6tsrm\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.294286 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.294376 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.310557 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.409915 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.410005 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-config\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.410060 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.410115 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tsrm\" (UniqueName: \"kubernetes.io/projected/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-kube-api-access-6tsrm\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.410148 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 
06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.410370 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.411427 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.416634 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.417505 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.419082 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-config\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.420402 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.434795 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tsrm\" (UniqueName: \"kubernetes.io/projected/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-kube-api-access-6tsrm\") pod \"dnsmasq-dns-5ccc5c4795-k5mql\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.598648 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.879802 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.883507 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.888725 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.889540 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-88l6p" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.889699 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 01:22:47 crc kubenswrapper[4991]: I1206 01:22:47.898448 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.034322 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltn46\" (UniqueName: \"kubernetes.io/projected/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-kube-api-access-ltn46\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.034445 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-scripts\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.034500 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 
01:22:48.034543 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-config-data\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.034747 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.034852 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-logs\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.035069 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.109883 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.136880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.136950 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltn46\" (UniqueName: \"kubernetes.io/projected/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-kube-api-access-ltn46\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.137041 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-scripts\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.137096 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.137146 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-config-data\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 
01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.137197 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.137220 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-logs\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.138433 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-logs\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.138944 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.139532 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.142091 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.142861 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-scripts\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: E1206 01:22:48.143745 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 06 01:22:48 crc kubenswrapper[4991]: E1206 01:22:48.143978 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lfw6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-znbg5_openstack(41629f2e-563a-4e76-8ac2-530e8d204578): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:22:48 crc kubenswrapper[4991]: E1206 01:22:48.147868 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-znbg5" podUID="41629f2e-563a-4e76-8ac2-530e8d204578" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.157703 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-config-data\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.162003 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltn46\" (UniqueName: \"kubernetes.io/projected/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-kube-api-access-ltn46\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.174150 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.238739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-dns-svc\") pod \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.238824 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-config\") pod \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.238881 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-sb\") pod \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.239016 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlc9z\" (UniqueName: \"kubernetes.io/projected/257d40ce-9d2a-4cc4-9e89-4cb777582c47-kube-api-access-tlc9z\") pod \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.239056 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-nb\") pod \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\" (UID: \"257d40ce-9d2a-4cc4-9e89-4cb777582c47\") " Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.243728 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257d40ce-9d2a-4cc4-9e89-4cb777582c47-kube-api-access-tlc9z" (OuterVolumeSpecName: "kube-api-access-tlc9z") pod "257d40ce-9d2a-4cc4-9e89-4cb777582c47" (UID: "257d40ce-9d2a-4cc4-9e89-4cb777582c47"). 
InnerVolumeSpecName "kube-api-access-tlc9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.253566 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.288582 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:22:48 crc kubenswrapper[4991]: E1206 01:22:48.289870 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.289897 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" Dec 06 01:22:48 crc kubenswrapper[4991]: E1206 01:22:48.289915 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="init" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.289923 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="init" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.290294 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" containerName="dnsmasq-dns" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.291596 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.295171 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.309453 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "257d40ce-9d2a-4cc4-9e89-4cb777582c47" (UID: "257d40ce-9d2a-4cc4-9e89-4cb777582c47"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.314450 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "257d40ce-9d2a-4cc4-9e89-4cb777582c47" (UID: "257d40ce-9d2a-4cc4-9e89-4cb777582c47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.314751 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.341082 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "257d40ce-9d2a-4cc4-9e89-4cb777582c47" (UID: "257d40ce-9d2a-4cc4-9e89-4cb777582c47"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.341663 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.341845 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.341858 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.341868 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlc9z\" (UniqueName: \"kubernetes.io/projected/257d40ce-9d2a-4cc4-9e89-4cb777582c47-kube-api-access-tlc9z\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.357623 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-config" (OuterVolumeSpecName: "config") pod "257d40ce-9d2a-4cc4-9e89-4cb777582c47" (UID: "257d40ce-9d2a-4cc4-9e89-4cb777582c47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.413734 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.413779 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2wbr6" event={"ID":"257d40ce-9d2a-4cc4-9e89-4cb777582c47","Type":"ContainerDied","Data":"aa13a8613f1e22def58732e570dd15827bf8ab0abc516669d48f6cd17ee5c575"} Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.413824 4991 scope.go:117] "RemoveContainer" containerID="4446c7bbe395dc5b4d0b88f1e738a6013f2ff5fd7470461981530e62075d68d1" Dec 06 01:22:48 crc kubenswrapper[4991]: E1206 01:22:48.415927 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-znbg5" podUID="41629f2e-563a-4e76-8ac2-530e8d204578" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447214 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447309 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447348 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447384 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447469 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2l2\" (UniqueName: \"kubernetes.io/projected/b1287126-1f1f-48cf-b13d-9be925e1c44c-kube-api-access-gg2l2\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447499 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.447694 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.448075 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/257d40ce-9d2a-4cc4-9e89-4cb777582c47-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.456968 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2wbr6"] Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.463618 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2wbr6"] Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549016 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2l2\" (UniqueName: \"kubernetes.io/projected/b1287126-1f1f-48cf-b13d-9be925e1c44c-kube-api-access-gg2l2\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549077 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549138 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549592 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549229 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549894 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549926 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.549966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.550571 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.550631 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.555155 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.556298 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.562929 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.567820 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2l2\" (UniqueName: \"kubernetes.io/projected/b1287126-1f1f-48cf-b13d-9be925e1c44c-kube-api-access-gg2l2\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.576084 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.652236 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:22:48 crc kubenswrapper[4991]: I1206 01:22:48.712841 4991 scope.go:117] "RemoveContainer" containerID="5491c90568a001a1638a6e6e8efe9b049d0706709eacdb70c2bedc4c77bb7654" Dec 06 01:22:49 crc kubenswrapper[4991]: I1206 01:22:49.480919 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257d40ce-9d2a-4cc4-9e89-4cb777582c47" path="/var/lib/kubelet/pods/257d40ce-9d2a-4cc4-9e89-4cb777582c47/volumes" Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.124626 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6664c6795f-tjg96"] Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.155898 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k5mql"] Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.176146 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b4cd4b7d8-99kjr"] Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.214060 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zmlwh"] Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.275369 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:50 crc kubenswrapper[4991]: W1206 01:22:50.286551 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5660e75_7f9d_4fc8_89f5_dd3d7f521f0a.slice/crio-8ef40676e291f3e494ca9ce1491ba0b3a61e9608616d2d36bc0948775817cde3 WatchSource:0}: Error finding container 
8ef40676e291f3e494ca9ce1491ba0b3a61e9608616d2d36bc0948775817cde3: Status 404 returned error can't find the container with id 8ef40676e291f3e494ca9ce1491ba0b3a61e9608616d2d36bc0948775817cde3 Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.336843 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-577fd657dd-psmdm"] Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.512950 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" event={"ID":"1a4881ee-332e-454c-bc1d-b99c932ad458","Type":"ContainerStarted","Data":"d65bb4ee5c75d55d011c56142544eab30f4ddb746b82b1f9e4c2c07ffac6e799"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.520359 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577fd657dd-psmdm" event={"ID":"15de5638-5e2b-4ce1-9d25-8f2830cd9241","Type":"ContainerStarted","Data":"7cfec5a4e1ac00a97ac1a435486fe92adc4836cd69a9772f9f99b5190f557875"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.533465 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84bc9ddf6-jd669"] Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.534768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zmlwh" event={"ID":"ae198f8f-705c-406d-af09-387de07cb200","Type":"ContainerStarted","Data":"2639371b3400e7e6d0c69bd14e14e60fee297734f7d9c4bd9804a3f57e6af534"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.536921 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b4cd4b7d8-99kjr" event={"ID":"ff6a4088-3509-40aa-a3f6-917e289cd44f","Type":"ContainerStarted","Data":"5343ad0f9b14637fe0702df3ffd775e7921cceca8119e9ad5a88b4e1f5d15d44"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.543460 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerStarted","Data":"50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624"} Dec 06 01:22:50 crc kubenswrapper[4991]: W1206 01:22:50.546331 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb305d8ce_5702_42c2_979a_15e0f7e46cae.slice/crio-3eb09c7d4664210572ecc25efb6e48b6166dc54613ca340bce7a1acf4114a777 WatchSource:0}: Error finding container 3eb09c7d4664210572ecc25efb6e48b6166dc54613ca340bce7a1acf4114a777: Status 404 returned error can't find the container with id 3eb09c7d4664210572ecc25efb6e48b6166dc54613ca340bce7a1acf4114a777 Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.547413 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a","Type":"ContainerStarted","Data":"8ef40676e291f3e494ca9ce1491ba0b3a61e9608616d2d36bc0948775817cde3"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.550049 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" event={"ID":"42ea5f2a-f150-4554-9f87-7ac46f04fd5b","Type":"ContainerStarted","Data":"8ac3748ed9bdac5e1bc0a6e2e02af985a2594df00de594efc4338f9cf2a8b1c9"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.553510 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-669lm" event={"ID":"01da8e85-cc82-46af-9370-419a710b2a8d","Type":"ContainerStarted","Data":"10da77ec5bb248db647aef422fbfa7a37863c9390171bce36e5c692118d93337"} Dec 06 01:22:50 crc kubenswrapper[4991]: I1206 01:22:50.574200 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-669lm" podStartSLOduration=4.728283163 podStartE2EDuration="35.574175528s" podCreationTimestamp="2025-12-06 01:22:15 +0000 UTC" firstStartedPulling="2025-12-06 01:22:17.171954497 +0000 UTC 
m=+1210.522244965" lastFinishedPulling="2025-12-06 01:22:48.017846862 +0000 UTC m=+1241.368137330" observedRunningTime="2025-12-06 01:22:50.570190964 +0000 UTC m=+1243.920481432" watchObservedRunningTime="2025-12-06 01:22:50.574175528 +0000 UTC m=+1243.924465996" Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.546936 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.576366 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577fd657dd-psmdm" event={"ID":"15de5638-5e2b-4ce1-9d25-8f2830cd9241","Type":"ContainerStarted","Data":"a3bb2433d9547ce7d6f58b3baf423c985ac7a0a025f072e494003c2e6e5628a1"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.577406 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zmlwh" event={"ID":"ae198f8f-705c-406d-af09-387de07cb200","Type":"ContainerStarted","Data":"eee10b7a9a72aa2cf7fe4e5cdd174929dfe17242e129ac00d999664b94fd1593"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.602345 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bc9ddf6-jd669" event={"ID":"b305d8ce-5702-42c2-979a-15e0f7e46cae","Type":"ContainerStarted","Data":"83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.602402 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bc9ddf6-jd669" event={"ID":"b305d8ce-5702-42c2-979a-15e0f7e46cae","Type":"ContainerStarted","Data":"7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.602412 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bc9ddf6-jd669" event={"ID":"b305d8ce-5702-42c2-979a-15e0f7e46cae","Type":"ContainerStarted","Data":"3eb09c7d4664210572ecc25efb6e48b6166dc54613ca340bce7a1acf4114a777"} Dec 06 01:22:51 
crc kubenswrapper[4991]: I1206 01:22:51.603368 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.626978 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b4cd4b7d8-99kjr" event={"ID":"ff6a4088-3509-40aa-a3f6-917e289cd44f","Type":"ContainerStarted","Data":"9c8d337db25b8faca94ece7dc0776b6fb45830b2b42f2b7d52e4a5a4778a214e"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.638267 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a","Type":"ContainerStarted","Data":"52102f446d27d906d5444ba68a67ce3450c1d64c2f4f20c8eb54f9f7f73e9dda"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.648899 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.649877 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zmlwh" podStartSLOduration=15.649867483 podStartE2EDuration="15.649867483s" podCreationTimestamp="2025-12-06 01:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:51.638685704 +0000 UTC m=+1244.988976172" watchObservedRunningTime="2025-12-06 01:22:51.649867483 +0000 UTC m=+1245.000157951" Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.681277 4991 generic.go:334] "Generic (PLEG): container finished" podID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerID="7902b390abdafc3527d4e237f3a3caff8d6e88c20108f4a1a1e904674c3803d6" exitCode=0 Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.681350 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" 
event={"ID":"42ea5f2a-f150-4554-9f87-7ac46f04fd5b","Type":"ContainerDied","Data":"7902b390abdafc3527d4e237f3a3caff8d6e88c20108f4a1a1e904674c3803d6"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.693402 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84bc9ddf6-jd669" podStartSLOduration=5.693387019 podStartE2EDuration="5.693387019s" podCreationTimestamp="2025-12-06 01:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:51.692719392 +0000 UTC m=+1245.043009880" watchObservedRunningTime="2025-12-06 01:22:51.693387019 +0000 UTC m=+1245.043677487" Dec 06 01:22:51 crc kubenswrapper[4991]: W1206 01:22:51.710175 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1287126_1f1f_48cf_b13d_9be925e1c44c.slice/crio-13bfc6d26050fde2df38be4f2157dcb539e5fac94b132e6e454279b8d55e6ffe WatchSource:0}: Error finding container 13bfc6d26050fde2df38be4f2157dcb539e5fac94b132e6e454279b8d55e6ffe: Status 404 returned error can't find the container with id 13bfc6d26050fde2df38be4f2157dcb539e5fac94b132e6e454279b8d55e6ffe Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.778390 4991 generic.go:334] "Generic (PLEG): container finished" podID="1a4881ee-332e-454c-bc1d-b99c932ad458" containerID="cdcc0e5d60fa8595d6f2404cba76d8f2076e0319ef2b848e17b5470dae67927d" exitCode=0 Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.779323 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" event={"ID":"1a4881ee-332e-454c-bc1d-b99c932ad458","Type":"ContainerDied","Data":"cdcc0e5d60fa8595d6f2404cba76d8f2076e0319ef2b848e17b5470dae67927d"} Dec 06 01:22:51 crc kubenswrapper[4991]: I1206 01:22:51.957082 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 
01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.448025 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.503153 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-svc\") pod \"1a4881ee-332e-454c-bc1d-b99c932ad458\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.503274 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58f59\" (UniqueName: \"kubernetes.io/projected/1a4881ee-332e-454c-bc1d-b99c932ad458-kube-api-access-58f59\") pod \"1a4881ee-332e-454c-bc1d-b99c932ad458\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.503308 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-config\") pod \"1a4881ee-332e-454c-bc1d-b99c932ad458\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.503470 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-swift-storage-0\") pod \"1a4881ee-332e-454c-bc1d-b99c932ad458\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.503546 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-sb\") pod \"1a4881ee-332e-454c-bc1d-b99c932ad458\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 
01:22:52.503613 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-nb\") pod \"1a4881ee-332e-454c-bc1d-b99c932ad458\" (UID: \"1a4881ee-332e-454c-bc1d-b99c932ad458\") " Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.537898 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4881ee-332e-454c-bc1d-b99c932ad458-kube-api-access-58f59" (OuterVolumeSpecName: "kube-api-access-58f59") pod "1a4881ee-332e-454c-bc1d-b99c932ad458" (UID: "1a4881ee-332e-454c-bc1d-b99c932ad458"). InnerVolumeSpecName "kube-api-access-58f59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.578829 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-config" (OuterVolumeSpecName: "config") pod "1a4881ee-332e-454c-bc1d-b99c932ad458" (UID: "1a4881ee-332e-454c-bc1d-b99c932ad458"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.597535 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a4881ee-332e-454c-bc1d-b99c932ad458" (UID: "1a4881ee-332e-454c-bc1d-b99c932ad458"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.615916 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.615961 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58f59\" (UniqueName: \"kubernetes.io/projected/1a4881ee-332e-454c-bc1d-b99c932ad458-kube-api-access-58f59\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.617974 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.626640 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a4881ee-332e-454c-bc1d-b99c932ad458" (UID: "1a4881ee-332e-454c-bc1d-b99c932ad458"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.645832 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a4881ee-332e-454c-bc1d-b99c932ad458" (UID: "1a4881ee-332e-454c-bc1d-b99c932ad458"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.718315 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a4881ee-332e-454c-bc1d-b99c932ad458" (UID: "1a4881ee-332e-454c-bc1d-b99c932ad458"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.719959 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.720396 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.720412 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a4881ee-332e-454c-bc1d-b99c932ad458-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.805179 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1287126-1f1f-48cf-b13d-9be925e1c44c","Type":"ContainerStarted","Data":"13bfc6d26050fde2df38be4f2157dcb539e5fac94b132e6e454279b8d55e6ffe"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.810198 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" event={"ID":"42ea5f2a-f150-4554-9f87-7ac46f04fd5b","Type":"ContainerStarted","Data":"84e655dd752044e98c8eb4073341bbe508f91f987fa6dc42189d09912c7132e2"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.811911 4991 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.814363 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" event={"ID":"1a4881ee-332e-454c-bc1d-b99c932ad458","Type":"ContainerDied","Data":"d65bb4ee5c75d55d011c56142544eab30f4ddb746b82b1f9e4c2c07ffac6e799"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.814406 4991 scope.go:117] "RemoveContainer" containerID="cdcc0e5d60fa8595d6f2404cba76d8f2076e0319ef2b848e17b5470dae67927d" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.814577 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6664c6795f-tjg96" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.851509 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577fd657dd-psmdm" event={"ID":"15de5638-5e2b-4ce1-9d25-8f2830cd9241","Type":"ContainerStarted","Data":"0fa79223f4d4c75cd7969c823d5d50f351df675117d6acd8052b9e7513fe1a4f"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.892092 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" podStartSLOduration=5.892039095 podStartE2EDuration="5.892039095s" podCreationTimestamp="2025-12-06 01:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:52.85238532 +0000 UTC m=+1246.202675808" watchObservedRunningTime="2025-12-06 01:22:52.892039095 +0000 UTC m=+1246.242329573" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.911256 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b4cd4b7d8-99kjr" event={"ID":"ff6a4088-3509-40aa-a3f6-917e289cd44f","Type":"ContainerStarted","Data":"696c3f31706c7bcf6be1fafe3f512c3ea184441fcb34083126b168a2bdddc61d"} Dec 06 01:22:53 crc 
kubenswrapper[4991]: I1206 01:22:52.943979 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a","Type":"ContainerStarted","Data":"617ba6166f048fef93eb1abedd6c9d0d8681c45dc8eb75b37fd3abba0c864ce9"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.948790 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-log" containerID="cri-o://52102f446d27d906d5444ba68a67ce3450c1d64c2f4f20c8eb54f9f7f73e9dda" gracePeriod=30 Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.950127 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-httpd" containerID="cri-o://617ba6166f048fef93eb1abedd6c9d0d8681c45dc8eb75b37fd3abba0c864ce9" gracePeriod=30 Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.972242 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6664c6795f-tjg96"] Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:52.995487 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6664c6795f-tjg96"] Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.003295 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-577fd657dd-psmdm" podStartSLOduration=28.506436824 podStartE2EDuration="29.003265552s" podCreationTimestamp="2025-12-06 01:22:24 +0000 UTC" firstStartedPulling="2025-12-06 01:22:50.398391642 +0000 UTC m=+1243.748682100" lastFinishedPulling="2025-12-06 01:22:50.89522036 +0000 UTC m=+1244.245510828" observedRunningTime="2025-12-06 01:22:52.925679875 +0000 UTC m=+1246.275970343" watchObservedRunningTime="2025-12-06 01:22:53.003265552 +0000 UTC m=+1246.353556010" Dec 06 01:22:53 crc 
kubenswrapper[4991]: I1206 01:22:53.024904 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b4cd4b7d8-99kjr" podStartSLOduration=28.474444436 podStartE2EDuration="29.024887641s" podCreationTimestamp="2025-12-06 01:22:24 +0000 UTC" firstStartedPulling="2025-12-06 01:22:50.159389591 +0000 UTC m=+1243.509680059" lastFinishedPulling="2025-12-06 01:22:50.709832796 +0000 UTC m=+1244.060123264" observedRunningTime="2025-12-06 01:22:52.979654331 +0000 UTC m=+1246.329944799" watchObservedRunningTime="2025-12-06 01:22:53.024887641 +0000 UTC m=+1246.375178109" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.052208 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.052179287 podStartE2EDuration="7.052179287s" podCreationTimestamp="2025-12-06 01:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:53.019770619 +0000 UTC m=+1246.370061087" watchObservedRunningTime="2025-12-06 01:22:53.052179287 +0000 UTC m=+1246.402469745" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.070497 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4881ee-332e-454c-bc1d-b99c932ad458" path="/var/lib/kubelet/pods/1a4881ee-332e-454c-bc1d-b99c932ad458/volumes" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.325358 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-779776dbb5-wnlcw"] Dec 06 01:22:53 crc kubenswrapper[4991]: E1206 01:22:53.326034 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4881ee-332e-454c-bc1d-b99c932ad458" containerName="init" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.326047 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4881ee-332e-454c-bc1d-b99c932ad458" containerName="init" Dec 06 01:22:53 crc kubenswrapper[4991]: 
I1206 01:22:53.328352 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4881ee-332e-454c-bc1d-b99c932ad458" containerName="init" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.332678 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.337136 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.337478 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.346438 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-779776dbb5-wnlcw"] Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442095 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-config\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442437 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-internal-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442488 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-combined-ca-bundle\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " 
pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442546 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4xw\" (UniqueName: \"kubernetes.io/projected/0027a35f-14c5-42ff-b153-359f129f0d2b-kube-api-access-sg4xw\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442601 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-httpd-config\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442643 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-ovndb-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.442703 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-public-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543707 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4xw\" (UniqueName: \"kubernetes.io/projected/0027a35f-14c5-42ff-b153-359f129f0d2b-kube-api-access-sg4xw\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") 
" pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543755 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-httpd-config\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543800 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-ovndb-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543854 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-public-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543919 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-config\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543942 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-internal-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.543974 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-combined-ca-bundle\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.551783 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-public-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.555554 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-internal-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.559534 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-combined-ca-bundle\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.561121 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-ovndb-tls-certs\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.561571 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-httpd-config\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.561941 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4xw\" (UniqueName: \"kubernetes.io/projected/0027a35f-14c5-42ff-b153-359f129f0d2b-kube-api-access-sg4xw\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.578465 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0027a35f-14c5-42ff-b153-359f129f0d2b-config\") pod \"neutron-779776dbb5-wnlcw\" (UID: \"0027a35f-14c5-42ff-b153-359f129f0d2b\") " pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.686633 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.969963 4991 generic.go:334] "Generic (PLEG): container finished" podID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerID="617ba6166f048fef93eb1abedd6c9d0d8681c45dc8eb75b37fd3abba0c864ce9" exitCode=0 Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.970947 4991 generic.go:334] "Generic (PLEG): container finished" podID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerID="52102f446d27d906d5444ba68a67ce3450c1d64c2f4f20c8eb54f9f7f73e9dda" exitCode=143 Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.970892 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a","Type":"ContainerDied","Data":"617ba6166f048fef93eb1abedd6c9d0d8681c45dc8eb75b37fd3abba0c864ce9"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.971290 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a","Type":"ContainerDied","Data":"52102f446d27d906d5444ba68a67ce3450c1d64c2f4f20c8eb54f9f7f73e9dda"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.980277 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1287126-1f1f-48cf-b13d-9be925e1c44c","Type":"ContainerStarted","Data":"5408590a89781a53168f67595a52025fdbc07e06f27bffa972a4406ef0fc689e"} Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.986599 4991 generic.go:334] "Generic (PLEG): container finished" podID="01da8e85-cc82-46af-9370-419a710b2a8d" containerID="10da77ec5bb248db647aef422fbfa7a37863c9390171bce36e5c692118d93337" exitCode=0 Dec 06 01:22:53 crc kubenswrapper[4991]: I1206 01:22:53.986723 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-669lm" 
event={"ID":"01da8e85-cc82-46af-9370-419a710b2a8d","Type":"ContainerDied","Data":"10da77ec5bb248db647aef422fbfa7a37863c9390171bce36e5c692118d93337"} Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.328294 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.384862 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-scripts\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.385094 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-logs\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.385196 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-httpd-run\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.385246 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-config-data\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.385321 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: 
\"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.385410 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-combined-ca-bundle\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.385502 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltn46\" (UniqueName: \"kubernetes.io/projected/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-kube-api-access-ltn46\") pod \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\" (UID: \"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a\") " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.387578 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.392564 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-logs" (OuterVolumeSpecName: "logs") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.398260 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-scripts" (OuterVolumeSpecName: "scripts") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.415614 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.417500 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-kube-api-access-ltn46" (OuterVolumeSpecName: "kube-api-access-ltn46") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "kube-api-access-ltn46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.469927 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-config-data" (OuterVolumeSpecName: "config-data") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.487305 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.487438 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.489864 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.490886 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.490898 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.490924 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.490934 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltn46\" (UniqueName: \"kubernetes.io/projected/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-kube-api-access-ltn46\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.490945 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: 
I1206 01:22:54.499198 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" (UID: "e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.529617 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-779776dbb5-wnlcw"] Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.565314 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.593837 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.593900 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.746461 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:54 crc kubenswrapper[4991]: I1206 01:22:54.746537 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.011070 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.011341 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a","Type":"ContainerDied","Data":"8ef40676e291f3e494ca9ce1491ba0b3a61e9608616d2d36bc0948775817cde3"} Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.011408 4991 scope.go:117] "RemoveContainer" containerID="617ba6166f048fef93eb1abedd6c9d0d8681c45dc8eb75b37fd3abba0c864ce9" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.028969 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1287126-1f1f-48cf-b13d-9be925e1c44c","Type":"ContainerStarted","Data":"59306b6b6a04260df3dcfe78b88c94bca0cfcc51a9ef321ccd89bc1b703df33d"} Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.029162 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-log" containerID="cri-o://5408590a89781a53168f67595a52025fdbc07e06f27bffa972a4406ef0fc689e" gracePeriod=30 Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.029493 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-httpd" containerID="cri-o://59306b6b6a04260df3dcfe78b88c94bca0cfcc51a9ef321ccd89bc1b703df33d" gracePeriod=30 Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.042563 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779776dbb5-wnlcw" event={"ID":"0027a35f-14c5-42ff-b153-359f129f0d2b","Type":"ContainerStarted","Data":"000e682356eafd4044facb6f6abe8f7ce64cbfc33a4d99487e248a313c7de2ad"} Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.042598 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-779776dbb5-wnlcw" event={"ID":"0027a35f-14c5-42ff-b153-359f129f0d2b","Type":"ContainerStarted","Data":"e345bb232e9d30784243da23da17bdade7b311433bfedc7636986ca9aa978c08"} Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.072346 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.072308845 podStartE2EDuration="8.072308845s" podCreationTimestamp="2025-12-06 01:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:55.056136867 +0000 UTC m=+1248.406427335" watchObservedRunningTime="2025-12-06 01:22:55.072308845 +0000 UTC m=+1248.422599313" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.080969 4991 scope.go:117] "RemoveContainer" containerID="52102f446d27d906d5444ba68a67ce3450c1d64c2f4f20c8eb54f9f7f73e9dda" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.092470 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.105208 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.128456 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:55 crc kubenswrapper[4991]: E1206 01:22:55.130020 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-log" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.130049 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-log" Dec 06 01:22:55 crc kubenswrapper[4991]: E1206 01:22:55.130142 4991 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-httpd" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.130151 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-httpd" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.130358 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-log" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.130381 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" containerName="glance-httpd" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.131674 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.135394 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.136929 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.151796 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.214839 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-logs\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.214924 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.215008 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-config-data\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.215040 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.215063 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.215322 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-scripts\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.215445 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.215693 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcsl\" (UniqueName: \"kubernetes.io/projected/e74c1123-26eb-43c0-935f-4e81cfebf716-kube-api-access-6fcsl\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.318569 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-scripts\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.318741 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.318846 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcsl\" (UniqueName: \"kubernetes.io/projected/e74c1123-26eb-43c0-935f-4e81cfebf716-kube-api-access-6fcsl\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.318934 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-logs\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.318983 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.319064 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-config-data\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.319101 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.319147 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.319905 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" 
(UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.320806 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-logs\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.321383 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.338705 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.338876 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-config-data\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.349305 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-scripts\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" 
Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.349543 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.350261 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcsl\" (UniqueName: \"kubernetes.io/projected/e74c1123-26eb-43c0-935f-4e81cfebf716-kube-api-access-6fcsl\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.353952 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " pod="openstack/glance-default-external-api-0" Dec 06 01:22:55 crc kubenswrapper[4991]: I1206 01:22:55.463013 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.058506 4991 generic.go:334] "Generic (PLEG): container finished" podID="ae198f8f-705c-406d-af09-387de07cb200" containerID="eee10b7a9a72aa2cf7fe4e5cdd174929dfe17242e129ac00d999664b94fd1593" exitCode=0 Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.058606 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zmlwh" event={"ID":"ae198f8f-705c-406d-af09-387de07cb200","Type":"ContainerDied","Data":"eee10b7a9a72aa2cf7fe4e5cdd174929dfe17242e129ac00d999664b94fd1593"} Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.069443 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779776dbb5-wnlcw" event={"ID":"0027a35f-14c5-42ff-b153-359f129f0d2b","Type":"ContainerStarted","Data":"097d8af37d8eb169373a61094c9a7e5ce6e568d211832fcdda7ea0a5d4308479"} Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.070056 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.073227 4991 generic.go:334] "Generic (PLEG): container finished" podID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerID="59306b6b6a04260df3dcfe78b88c94bca0cfcc51a9ef321ccd89bc1b703df33d" exitCode=0 Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.073276 4991 generic.go:334] "Generic (PLEG): container finished" podID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerID="5408590a89781a53168f67595a52025fdbc07e06f27bffa972a4406ef0fc689e" exitCode=143 Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.073326 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1287126-1f1f-48cf-b13d-9be925e1c44c","Type":"ContainerDied","Data":"59306b6b6a04260df3dcfe78b88c94bca0cfcc51a9ef321ccd89bc1b703df33d"} Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 
01:22:56.073372 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1287126-1f1f-48cf-b13d-9be925e1c44c","Type":"ContainerDied","Data":"5408590a89781a53168f67595a52025fdbc07e06f27bffa972a4406ef0fc689e"} Dec 06 01:22:56 crc kubenswrapper[4991]: I1206 01:22:56.126873 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-779776dbb5-wnlcw" podStartSLOduration=3.126851265 podStartE2EDuration="3.126851265s" podCreationTimestamp="2025-12-06 01:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:22:56.122908843 +0000 UTC m=+1249.473199321" watchObservedRunningTime="2025-12-06 01:22:56.126851265 +0000 UTC m=+1249.477141733" Dec 06 01:22:57 crc kubenswrapper[4991]: I1206 01:22:57.080071 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a" path="/var/lib/kubelet/pods/e5660e75-7f9d-4fc8-89f5-dd3d7f521f0a/volumes" Dec 06 01:22:57 crc kubenswrapper[4991]: I1206 01:22:57.601176 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:22:57 crc kubenswrapper[4991]: I1206 01:22:57.734653 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-vx96n"] Dec 06 01:22:57 crc kubenswrapper[4991]: I1206 01:22:57.734888 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerName="dnsmasq-dns" containerID="cri-o://9c934d2d7ee53dbcc0b8a8a439ed753250da5345430007a106d20855b1af3ec1" gracePeriod=10 Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.134867 4991 generic.go:334] "Generic (PLEG): container finished" podID="dbd0cb1a-1401-4031-b980-ef43103702a3" 
containerID="9c934d2d7ee53dbcc0b8a8a439ed753250da5345430007a106d20855b1af3ec1" exitCode=0 Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.135345 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" event={"ID":"dbd0cb1a-1401-4031-b980-ef43103702a3","Type":"ContainerDied","Data":"9c934d2d7ee53dbcc0b8a8a439ed753250da5345430007a106d20855b1af3ec1"} Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.775294 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.777414 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-669lm" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.947716 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-scripts\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949032 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-config-data\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949179 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-credential-keys\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949214 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949235 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-combined-ca-bundle\") pod \"01da8e85-cc82-46af-9370-419a710b2a8d\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949279 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zps6\" (UniqueName: \"kubernetes.io/projected/ae198f8f-705c-406d-af09-387de07cb200-kube-api-access-7zps6\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949319 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-fernet-keys\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949371 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da8e85-cc82-46af-9370-419a710b2a8d-logs\") pod \"01da8e85-cc82-46af-9370-419a710b2a8d\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949408 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-config-data\") pod \"01da8e85-cc82-46af-9370-419a710b2a8d\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949470 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gld2z\" (UniqueName: \"kubernetes.io/projected/01da8e85-cc82-46af-9370-419a710b2a8d-kube-api-access-gld2z\") pod \"01da8e85-cc82-46af-9370-419a710b2a8d\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949520 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-scripts\") pod \"01da8e85-cc82-46af-9370-419a710b2a8d\" (UID: \"01da8e85-cc82-46af-9370-419a710b2a8d\") " Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.949942 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01da8e85-cc82-46af-9370-419a710b2a8d-logs" (OuterVolumeSpecName: "logs") pod "01da8e85-cc82-46af-9370-419a710b2a8d" (UID: "01da8e85-cc82-46af-9370-419a710b2a8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.950032 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01da8e85-cc82-46af-9370-419a710b2a8d-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.959962 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae198f8f-705c-406d-af09-387de07cb200-kube-api-access-7zps6" (OuterVolumeSpecName: "kube-api-access-7zps6") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200"). InnerVolumeSpecName "kube-api-access-7zps6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.964232 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.964288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.965863 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-scripts" (OuterVolumeSpecName: "scripts") pod "01da8e85-cc82-46af-9370-419a710b2a8d" (UID: "01da8e85-cc82-46af-9370-419a710b2a8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.965961 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-scripts" (OuterVolumeSpecName: "scripts") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.971545 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:22:59 crc kubenswrapper[4991]: I1206 01:22:59.973389 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01da8e85-cc82-46af-9370-419a710b2a8d-kube-api-access-gld2z" (OuterVolumeSpecName: "kube-api-access-gld2z") pod "01da8e85-cc82-46af-9370-419a710b2a8d" (UID: "01da8e85-cc82-46af-9370-419a710b2a8d"). InnerVolumeSpecName "kube-api-access-gld2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.001862 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle podName:ae198f8f-705c-406d-af09-387de07cb200 nodeName:}" failed. No retries permitted until 2025-12-06 01:23:00.501827999 +0000 UTC m=+1253.852118467 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200") : error deleting /var/lib/kubelet/pods/ae198f8f-705c-406d-af09-387de07cb200/volume-subpaths: remove /var/lib/kubelet/pods/ae198f8f-705c-406d-af09-387de07cb200/volume-subpaths: no such file or directory Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.014658 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-config-data" (OuterVolumeSpecName: "config-data") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.050248 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01da8e85-cc82-46af-9370-419a710b2a8d" (UID: "01da8e85-cc82-46af-9370-419a710b2a8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051332 4991 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051347 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051357 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zps6\" (UniqueName: \"kubernetes.io/projected/ae198f8f-705c-406d-af09-387de07cb200-kube-api-access-7zps6\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051368 4991 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051376 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gld2z\" (UniqueName: \"kubernetes.io/projected/01da8e85-cc82-46af-9370-419a710b2a8d-kube-api-access-gld2z\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051388 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051396 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.051403 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.052868 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-config-data" (OuterVolumeSpecName: "config-data") pod "01da8e85-cc82-46af-9370-419a710b2a8d" (UID: "01da8e85-cc82-46af-9370-419a710b2a8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.073079 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.155574 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-svc\") pod \"dbd0cb1a-1401-4031-b980-ef43103702a3\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.157319 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-sb\") pod \"dbd0cb1a-1401-4031-b980-ef43103702a3\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.157378 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnjf6\" (UniqueName: \"kubernetes.io/projected/dbd0cb1a-1401-4031-b980-ef43103702a3-kube-api-access-wnjf6\") pod \"dbd0cb1a-1401-4031-b980-ef43103702a3\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.157485 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-nb\") pod \"dbd0cb1a-1401-4031-b980-ef43103702a3\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.157555 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-config\") pod \"dbd0cb1a-1401-4031-b980-ef43103702a3\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.157612 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-swift-storage-0\") pod \"dbd0cb1a-1401-4031-b980-ef43103702a3\" (UID: \"dbd0cb1a-1401-4031-b980-ef43103702a3\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.159998 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01da8e85-cc82-46af-9370-419a710b2a8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.162855 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd0cb1a-1401-4031-b980-ef43103702a3-kube-api-access-wnjf6" (OuterVolumeSpecName: "kube-api-access-wnjf6") pod "dbd0cb1a-1401-4031-b980-ef43103702a3" (UID: "dbd0cb1a-1401-4031-b980-ef43103702a3"). InnerVolumeSpecName "kube-api-access-wnjf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.165068 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerStarted","Data":"539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484"} Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.170052 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b1287126-1f1f-48cf-b13d-9be925e1c44c","Type":"ContainerDied","Data":"13bfc6d26050fde2df38be4f2157dcb539e5fac94b132e6e454279b8d55e6ffe"} Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.170078 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.170108 4991 scope.go:117] "RemoveContainer" containerID="59306b6b6a04260df3dcfe78b88c94bca0cfcc51a9ef321ccd89bc1b703df33d" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.187450 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" event={"ID":"dbd0cb1a-1401-4031-b980-ef43103702a3","Type":"ContainerDied","Data":"7f972dbb7405c1b3ef4ece8f0b690c43e3eaaa07d80583640deebd6aa787acfa"} Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.187571 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-vx96n" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.193343 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-669lm" event={"ID":"01da8e85-cc82-46af-9370-419a710b2a8d","Type":"ContainerDied","Data":"37e5f2f512e37b70fa5a33e5507bd1fdb5abdd9ec0f6282946c2551e6c0715c2"} Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.193370 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e5f2f512e37b70fa5a33e5507bd1fdb5abdd9ec0f6282946c2551e6c0715c2" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.193412 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-669lm" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.194744 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zmlwh" event={"ID":"ae198f8f-705c-406d-af09-387de07cb200","Type":"ContainerDied","Data":"2639371b3400e7e6d0c69bd14e14e60fee297734f7d9c4bd9804a3f57e6af534"} Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.194770 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2639371b3400e7e6d0c69bd14e14e60fee297734f7d9c4bd9804a3f57e6af534" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.194842 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zmlwh" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.215111 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbd0cb1a-1401-4031-b980-ef43103702a3" (UID: "dbd0cb1a-1401-4031-b980-ef43103702a3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.216351 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbd0cb1a-1401-4031-b980-ef43103702a3" (UID: "dbd0cb1a-1401-4031-b980-ef43103702a3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.217110 4991 scope.go:117] "RemoveContainer" containerID="5408590a89781a53168f67595a52025fdbc07e06f27bffa972a4406ef0fc689e" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.227260 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dbd0cb1a-1401-4031-b980-ef43103702a3" (UID: "dbd0cb1a-1401-4031-b980-ef43103702a3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.231273 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-config" (OuterVolumeSpecName: "config") pod "dbd0cb1a-1401-4031-b980-ef43103702a3" (UID: "dbd0cb1a-1401-4031-b980-ef43103702a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.237908 4991 scope.go:117] "RemoveContainer" containerID="9c934d2d7ee53dbcc0b8a8a439ed753250da5345430007a106d20855b1af3ec1" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.239944 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbd0cb1a-1401-4031-b980-ef43103702a3" (UID: "dbd0cb1a-1401-4031-b980-ef43103702a3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261622 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-combined-ca-bundle\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261737 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-config-data\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261757 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-logs\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261783 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261801 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-httpd-run\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261857 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg2l2\" (UniqueName: 
\"kubernetes.io/projected/b1287126-1f1f-48cf-b13d-9be925e1c44c-kube-api-access-gg2l2\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.261955 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-scripts\") pod \"b1287126-1f1f-48cf-b13d-9be925e1c44c\" (UID: \"b1287126-1f1f-48cf-b13d-9be925e1c44c\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.262422 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.262441 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.262452 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnjf6\" (UniqueName: \"kubernetes.io/projected/dbd0cb1a-1401-4031-b980-ef43103702a3-kube-api-access-wnjf6\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.262463 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.262474 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.262483 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/dbd0cb1a-1401-4031-b980-ef43103702a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.263107 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.263139 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-logs" (OuterVolumeSpecName: "logs") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.266321 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1287126-1f1f-48cf-b13d-9be925e1c44c-kube-api-access-gg2l2" (OuterVolumeSpecName: "kube-api-access-gg2l2") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "kube-api-access-gg2l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.278696 4991 scope.go:117] "RemoveContainer" containerID="b98e9f30a746c9a4915cf9a938394e3fb83ee03190d0c733709a89aca75599c2" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.279103 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.279786 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-scripts" (OuterVolumeSpecName: "scripts") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.319162 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-config-data" (OuterVolumeSpecName: "config-data") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.319395 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1287126-1f1f-48cf-b13d-9be925e1c44c" (UID: "b1287126-1f1f-48cf-b13d-9be925e1c44c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364779 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364827 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364837 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364878 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364886 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1287126-1f1f-48cf-b13d-9be925e1c44c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364895 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg2l2\" (UniqueName: \"kubernetes.io/projected/b1287126-1f1f-48cf-b13d-9be925e1c44c-kube-api-access-gg2l2\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.364907 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1287126-1f1f-48cf-b13d-9be925e1c44c-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.366069 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.393123 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.466743 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.527049 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.540375 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551226 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.551639 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae198f8f-705c-406d-af09-387de07cb200" containerName="keystone-bootstrap" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551656 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae198f8f-705c-406d-af09-387de07cb200" containerName="keystone-bootstrap" Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.551676 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-httpd" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551684 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-httpd" Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.551693 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01da8e85-cc82-46af-9370-419a710b2a8d" 
containerName="placement-db-sync" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551700 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="01da8e85-cc82-46af-9370-419a710b2a8d" containerName="placement-db-sync" Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.551723 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-log" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551729 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-log" Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.551740 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerName="init" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551748 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerName="init" Dec 06 01:23:00 crc kubenswrapper[4991]: E1206 01:23:00.551761 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerName="dnsmasq-dns" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.551769 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerName="dnsmasq-dns" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.552007 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae198f8f-705c-406d-af09-387de07cb200" containerName="keystone-bootstrap" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.552022 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="01da8e85-cc82-46af-9370-419a710b2a8d" containerName="placement-db-sync" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.552035 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-log" Dec 06 
01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.552045 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" containerName="glance-httpd" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.552058 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" containerName="dnsmasq-dns" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.552931 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.556634 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.557308 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.560815 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.570481 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle\") pod \"ae198f8f-705c-406d-af09-387de07cb200\" (UID: \"ae198f8f-705c-406d-af09-387de07cb200\") " Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.592577 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae198f8f-705c-406d-af09-387de07cb200" (UID: "ae198f8f-705c-406d-af09-387de07cb200"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672077 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672119 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672180 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672204 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672247 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672290 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpq4m\" (UniqueName: \"kubernetes.io/projected/7f8e1b96-e706-4f50-b033-b0fb66936deb-kube-api-access-bpq4m\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672316 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672353 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.672396 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae198f8f-705c-406d-af09-387de07cb200-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.749778 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-vx96n"] Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.760344 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-vx96n"] Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774198 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774249 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774282 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774325 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpq4m\" (UniqueName: \"kubernetes.io/projected/7f8e1b96-e706-4f50-b033-b0fb66936deb-kube-api-access-bpq4m\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774353 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774387 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774413 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.774429 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.775462 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.781717 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.782792 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.785144 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.785218 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.785593 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.786047 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.808535 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpq4m\" (UniqueName: \"kubernetes.io/projected/7f8e1b96-e706-4f50-b033-b0fb66936deb-kube-api-access-bpq4m\") pod 
\"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:00 crc kubenswrapper[4991]: I1206 01:23:00.812451 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.029190 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.029773 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-99654f6-8nq4x"] Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.030940 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.038956 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.039222 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.039364 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z245k" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.039616 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.039809 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.042393 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.083120 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1287126-1f1f-48cf-b13d-9be925e1c44c" path="/var/lib/kubelet/pods/b1287126-1f1f-48cf-b13d-9be925e1c44c/volumes" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.092522 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd0cb1a-1401-4031-b980-ef43103702a3" path="/var/lib/kubelet/pods/dbd0cb1a-1401-4031-b980-ef43103702a3/volumes" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.093326 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d59dffb86-qrfm9"] Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.094658 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-99654f6-8nq4x"] Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.094737 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.099267 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.099514 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.099629 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.099726 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xl2dp" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.099817 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.126500 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-6d59dffb86-qrfm9"] Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180236 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-public-tls-certs\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180555 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-credential-keys\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180630 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-config-data\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180656 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-scripts\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180746 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-public-tls-certs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " 
pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180776 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-scripts\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180797 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-internal-tls-certs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.180819 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-combined-ca-bundle\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181129 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-config-data\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181182 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-fernet-keys\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " 
pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181227 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-internal-tls-certs\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181244 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdh6z\" (UniqueName: \"kubernetes.io/projected/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-kube-api-access-rdh6z\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181292 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-logs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181315 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djst9\" (UniqueName: \"kubernetes.io/projected/72ca5f78-0a0d-4e09-8975-cba79bab4842-kube-api-access-djst9\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.181345 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-combined-ca-bundle\") pod \"placement-6d59dffb86-qrfm9\" (UID: 
\"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.208116 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sjq8" event={"ID":"1f102ba4-81e9-4d61-b5a7-90126ab9dd80","Type":"ContainerStarted","Data":"206f7e2f38328658fe5c0922e582194c7963df17b8d2aa3ff32ca649c0c51a4d"} Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.213885 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e74c1123-26eb-43c0-935f-4e81cfebf716","Type":"ContainerStarted","Data":"aa2eb5f0f8a82137ce365cd29c102ac4411dd75fd6727867985f43f055aecaae"} Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.213926 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e74c1123-26eb-43c0-935f-4e81cfebf716","Type":"ContainerStarted","Data":"4920aac871b90fab1103f6babeca5ad9c803cafb80961d4db51b37240c5c921b"} Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.235905 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7sjq8" podStartSLOduration=2.664377165 podStartE2EDuration="46.235883361s" podCreationTimestamp="2025-12-06 01:22:15 +0000 UTC" firstStartedPulling="2025-12-06 01:22:16.905317566 +0000 UTC m=+1210.255608034" lastFinishedPulling="2025-12-06 01:23:00.476823762 +0000 UTC m=+1253.827114230" observedRunningTime="2025-12-06 01:23:01.225239306 +0000 UTC m=+1254.575529774" watchObservedRunningTime="2025-12-06 01:23:01.235883361 +0000 UTC m=+1254.586173849" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283238 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-public-tls-certs\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " 
pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283302 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-credential-keys\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283338 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-config-data\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283361 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-scripts\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283393 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-public-tls-certs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283417 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-scripts\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283446 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-internal-tls-certs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283469 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-combined-ca-bundle\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283486 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-config-data\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283534 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-fernet-keys\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283568 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-internal-tls-certs\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283588 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdh6z\" (UniqueName: 
\"kubernetes.io/projected/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-kube-api-access-rdh6z\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283641 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-logs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283663 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djst9\" (UniqueName: \"kubernetes.io/projected/72ca5f78-0a0d-4e09-8975-cba79bab4842-kube-api-access-djst9\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.283686 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-combined-ca-bundle\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.292052 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-logs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.297575 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-config-data\") pod \"placement-6d59dffb86-qrfm9\" (UID: 
\"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.298375 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-internal-tls-certs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.300879 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-public-tls-certs\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.301610 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-internal-tls-certs\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.301651 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-fernet-keys\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.305297 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-scripts\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.308764 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-combined-ca-bundle\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.309112 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-public-tls-certs\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.309680 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-config-data\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.310155 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-scripts\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.310852 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-credential-keys\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.311630 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djst9\" (UniqueName: 
\"kubernetes.io/projected/72ca5f78-0a0d-4e09-8975-cba79bab4842-kube-api-access-djst9\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.312006 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca5f78-0a0d-4e09-8975-cba79bab4842-combined-ca-bundle\") pod \"keystone-99654f6-8nq4x\" (UID: \"72ca5f78-0a0d-4e09-8975-cba79bab4842\") " pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.318572 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdh6z\" (UniqueName: \"kubernetes.io/projected/cd5faf07-40cf-4cce-9c5a-0fb43ffbb993-kube-api-access-rdh6z\") pod \"placement-6d59dffb86-qrfm9\" (UID: \"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993\") " pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.402641 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.418415 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.657674 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:23:01 crc kubenswrapper[4991]: W1206 01:23:01.692545 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8e1b96_e706_4f50_b033_b0fb66936deb.slice/crio-81f722b0b530fbbf80e3ab03e0964afc4bb44f27cbb6f08260eee6176b260f60 WatchSource:0}: Error finding container 81f722b0b530fbbf80e3ab03e0964afc4bb44f27cbb6f08260eee6176b260f60: Status 404 returned error can't find the container with id 81f722b0b530fbbf80e3ab03e0964afc4bb44f27cbb6f08260eee6176b260f60 Dec 06 01:23:01 crc kubenswrapper[4991]: I1206 01:23:01.882481 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-99654f6-8nq4x"] Dec 06 01:23:01 crc kubenswrapper[4991]: W1206 01:23:01.893126 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72ca5f78_0a0d_4e09_8975_cba79bab4842.slice/crio-fe732f58811fde42e9a5c364fd0cf00d420f0b17633525f6370963638c26a1b9 WatchSource:0}: Error finding container fe732f58811fde42e9a5c364fd0cf00d420f0b17633525f6370963638c26a1b9: Status 404 returned error can't find the container with id fe732f58811fde42e9a5c364fd0cf00d420f0b17633525f6370963638c26a1b9 Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.034623 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d59dffb86-qrfm9"] Dec 06 01:23:02 crc kubenswrapper[4991]: W1206 01:23:02.066352 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5faf07_40cf_4cce_9c5a_0fb43ffbb993.slice/crio-3dc6c1700e105bd7d941005e591204a9d2e2992585974a102d8e385611500cd4 WatchSource:0}: Error finding container 
3dc6c1700e105bd7d941005e591204a9d2e2992585974a102d8e385611500cd4: Status 404 returned error can't find the container with id 3dc6c1700e105bd7d941005e591204a9d2e2992585974a102d8e385611500cd4 Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.233963 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f8e1b96-e706-4f50-b033-b0fb66936deb","Type":"ContainerStarted","Data":"81f722b0b530fbbf80e3ab03e0964afc4bb44f27cbb6f08260eee6176b260f60"} Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.241805 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-99654f6-8nq4x" event={"ID":"72ca5f78-0a0d-4e09-8975-cba79bab4842","Type":"ContainerStarted","Data":"b362815a438aabe547d4889fb923da8ab05344069ed79ba2c85102efc429de55"} Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.241899 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-99654f6-8nq4x" event={"ID":"72ca5f78-0a0d-4e09-8975-cba79bab4842","Type":"ContainerStarted","Data":"fe732f58811fde42e9a5c364fd0cf00d420f0b17633525f6370963638c26a1b9"} Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.242074 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.243953 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d59dffb86-qrfm9" event={"ID":"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993","Type":"ContainerStarted","Data":"3dc6c1700e105bd7d941005e591204a9d2e2992585974a102d8e385611500cd4"} Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.249805 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e74c1123-26eb-43c0-935f-4e81cfebf716","Type":"ContainerStarted","Data":"5cf4078301fb73ac5dccc6888d936b4d9b3cf6e69e8ef008ad49e9f54b072c31"} Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.278135 4991 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-99654f6-8nq4x" podStartSLOduration=2.278114042 podStartE2EDuration="2.278114042s" podCreationTimestamp="2025-12-06 01:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:02.26527937 +0000 UTC m=+1255.615569848" watchObservedRunningTime="2025-12-06 01:23:02.278114042 +0000 UTC m=+1255.628404510" Dec 06 01:23:02 crc kubenswrapper[4991]: I1206 01:23:02.295262 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.295234705 podStartE2EDuration="7.295234705s" podCreationTimestamp="2025-12-06 01:22:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:02.287654919 +0000 UTC m=+1255.637945387" watchObservedRunningTime="2025-12-06 01:23:02.295234705 +0000 UTC m=+1255.645525173" Dec 06 01:23:03 crc kubenswrapper[4991]: I1206 01:23:03.264123 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d59dffb86-qrfm9" event={"ID":"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993","Type":"ContainerStarted","Data":"e628d981b9723ee5938c2ba88c3779642bdd7d22d85d5c48fab141537c832a2e"} Dec 06 01:23:03 crc kubenswrapper[4991]: I1206 01:23:03.264636 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d59dffb86-qrfm9" event={"ID":"cd5faf07-40cf-4cce-9c5a-0fb43ffbb993","Type":"ContainerStarted","Data":"29e4e4bc2b8a4992d5f235b04eaa8f2368a9e708df2dfb20caf902bc056a2d37"} Dec 06 01:23:03 crc kubenswrapper[4991]: I1206 01:23:03.265147 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:03 crc kubenswrapper[4991]: I1206 01:23:03.265170 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:03 crc kubenswrapper[4991]: I1206 01:23:03.268713 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f8e1b96-e706-4f50-b033-b0fb66936deb","Type":"ContainerStarted","Data":"70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7"} Dec 06 01:23:03 crc kubenswrapper[4991]: I1206 01:23:03.298516 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d59dffb86-qrfm9" podStartSLOduration=2.298488578 podStartE2EDuration="2.298488578s" podCreationTimestamp="2025-12-06 01:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:03.289827754 +0000 UTC m=+1256.640118232" watchObservedRunningTime="2025-12-06 01:23:03.298488578 +0000 UTC m=+1256.648779046" Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.277759 4991 generic.go:334] "Generic (PLEG): container finished" podID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" containerID="206f7e2f38328658fe5c0922e582194c7963df17b8d2aa3ff32ca649c0c51a4d" exitCode=0 Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.277842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sjq8" event={"ID":"1f102ba4-81e9-4d61-b5a7-90126ab9dd80","Type":"ContainerDied","Data":"206f7e2f38328658fe5c0922e582194c7963df17b8d2aa3ff32ca649c0c51a4d"} Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.280101 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znbg5" event={"ID":"41629f2e-563a-4e76-8ac2-530e8d204578","Type":"ContainerStarted","Data":"c8d22aa6c500f81dff21c8443461eed1cfc666214b361737a7232ea85a332bcd"} Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.284681 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7f8e1b96-e706-4f50-b033-b0fb66936deb","Type":"ContainerStarted","Data":"026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6"} Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.307829 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-znbg5" podStartSLOduration=3.5284045600000002 podStartE2EDuration="49.307811399s" podCreationTimestamp="2025-12-06 01:22:15 +0000 UTC" firstStartedPulling="2025-12-06 01:22:16.754906208 +0000 UTC m=+1210.105196676" lastFinishedPulling="2025-12-06 01:23:02.534313047 +0000 UTC m=+1255.884603515" observedRunningTime="2025-12-06 01:23:04.305362256 +0000 UTC m=+1257.655652724" watchObservedRunningTime="2025-12-06 01:23:04.307811399 +0000 UTC m=+1257.658101857" Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.339763 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.339739425 podStartE2EDuration="4.339739425s" podCreationTimestamp="2025-12-06 01:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:04.325912327 +0000 UTC m=+1257.676202785" watchObservedRunningTime="2025-12-06 01:23:04.339739425 +0000 UTC m=+1257.690029903" Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.488997 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b4cd4b7d8-99kjr" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 06 01:23:04 crc kubenswrapper[4991]: I1206 01:23:04.749161 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-577fd657dd-psmdm" podUID="15de5638-5e2b-4ce1-9d25-8f2830cd9241" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 06 01:23:05 crc kubenswrapper[4991]: I1206 01:23:05.465708 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 01:23:05 crc kubenswrapper[4991]: I1206 01:23:05.465830 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 01:23:05 crc kubenswrapper[4991]: I1206 01:23:05.514527 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 01:23:05 crc kubenswrapper[4991]: I1206 01:23:05.560627 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 01:23:06 crc kubenswrapper[4991]: I1206 01:23:06.330836 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 01:23:06 crc kubenswrapper[4991]: I1206 01:23:06.330933 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 01:23:07 crc kubenswrapper[4991]: I1206 01:23:07.339440 4991 generic.go:334] "Generic (PLEG): container finished" podID="41629f2e-563a-4e76-8ac2-530e8d204578" containerID="c8d22aa6c500f81dff21c8443461eed1cfc666214b361737a7232ea85a332bcd" exitCode=0 Dec 06 01:23:07 crc kubenswrapper[4991]: I1206 01:23:07.340750 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znbg5" event={"ID":"41629f2e-563a-4e76-8ac2-530e8d204578","Type":"ContainerDied","Data":"c8d22aa6c500f81dff21c8443461eed1cfc666214b361737a7232ea85a332bcd"} Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.151919 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.237452 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-db-sync-config-data\") pod \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.237671 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-combined-ca-bundle\") pod \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.237816 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sz6n\" (UniqueName: \"kubernetes.io/projected/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-kube-api-access-5sz6n\") pod \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\" (UID: \"1f102ba4-81e9-4d61-b5a7-90126ab9dd80\") " Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.243838 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-kube-api-access-5sz6n" (OuterVolumeSpecName: "kube-api-access-5sz6n") pod "1f102ba4-81e9-4d61-b5a7-90126ab9dd80" (UID: "1f102ba4-81e9-4d61-b5a7-90126ab9dd80"). InnerVolumeSpecName "kube-api-access-5sz6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.255167 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1f102ba4-81e9-4d61-b5a7-90126ab9dd80" (UID: "1f102ba4-81e9-4d61-b5a7-90126ab9dd80"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.279721 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f102ba4-81e9-4d61-b5a7-90126ab9dd80" (UID: "1f102ba4-81e9-4d61-b5a7-90126ab9dd80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.296670 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.297663 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.340523 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sz6n\" (UniqueName: \"kubernetes.io/projected/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-kube-api-access-5sz6n\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.340558 4991 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.340569 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f102ba4-81e9-4d61-b5a7-90126ab9dd80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.375020 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7sjq8" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.375097 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sjq8" event={"ID":"1f102ba4-81e9-4d61-b5a7-90126ab9dd80","Type":"ContainerDied","Data":"b5284fb9c382af08233d541cce1c61397b12a6ccd2cbc14469ff8f1064cb392d"} Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.375143 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5284fb9c382af08233d541cce1c61397b12a6ccd2cbc14469ff8f1064cb392d" Dec 06 01:23:08 crc kubenswrapper[4991]: I1206 01:23:08.956357 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-znbg5" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.148464 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-combined-ca-bundle\") pod \"41629f2e-563a-4e76-8ac2-530e8d204578\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.148634 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-db-sync-config-data\") pod \"41629f2e-563a-4e76-8ac2-530e8d204578\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.148783 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-config-data\") pod \"41629f2e-563a-4e76-8ac2-530e8d204578\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.149015 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-scripts\") pod \"41629f2e-563a-4e76-8ac2-530e8d204578\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.149138 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41629f2e-563a-4e76-8ac2-530e8d204578-etc-machine-id\") pod \"41629f2e-563a-4e76-8ac2-530e8d204578\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.149264 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfw6n\" (UniqueName: \"kubernetes.io/projected/41629f2e-563a-4e76-8ac2-530e8d204578-kube-api-access-lfw6n\") pod \"41629f2e-563a-4e76-8ac2-530e8d204578\" (UID: \"41629f2e-563a-4e76-8ac2-530e8d204578\") " Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.151144 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41629f2e-563a-4e76-8ac2-530e8d204578-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "41629f2e-563a-4e76-8ac2-530e8d204578" (UID: "41629f2e-563a-4e76-8ac2-530e8d204578"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.176357 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "41629f2e-563a-4e76-8ac2-530e8d204578" (UID: "41629f2e-563a-4e76-8ac2-530e8d204578"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.176524 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-scripts" (OuterVolumeSpecName: "scripts") pod "41629f2e-563a-4e76-8ac2-530e8d204578" (UID: "41629f2e-563a-4e76-8ac2-530e8d204578"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.182590 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41629f2e-563a-4e76-8ac2-530e8d204578-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.223169 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41629f2e-563a-4e76-8ac2-530e8d204578-kube-api-access-lfw6n" (OuterVolumeSpecName: "kube-api-access-lfw6n") pod "41629f2e-563a-4e76-8ac2-530e8d204578" (UID: "41629f2e-563a-4e76-8ac2-530e8d204578"). InnerVolumeSpecName "kube-api-access-lfw6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.244454 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-config-data" (OuterVolumeSpecName: "config-data") pod "41629f2e-563a-4e76-8ac2-530e8d204578" (UID: "41629f2e-563a-4e76-8ac2-530e8d204578"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.247148 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41629f2e-563a-4e76-8ac2-530e8d204578" (UID: "41629f2e-563a-4e76-8ac2-530e8d204578"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.287011 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.287082 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfw6n\" (UniqueName: \"kubernetes.io/projected/41629f2e-563a-4e76-8ac2-530e8d204578-kube-api-access-lfw6n\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.287097 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.287107 4991 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.287115 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41629f2e-563a-4e76-8ac2-530e8d204578-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.394455 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-znbg5" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.400268 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-znbg5" event={"ID":"41629f2e-563a-4e76-8ac2-530e8d204578","Type":"ContainerDied","Data":"3ade1b923629e9a6e33edd628f00e4382beae678cf0d08c7ada1f262e685f674"} Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.400332 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ade1b923629e9a6e33edd628f00e4382beae678cf0d08c7ada1f262e685f674" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.486066 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d69cddbc-dkwjg"] Dec 06 01:23:09 crc kubenswrapper[4991]: E1206 01:23:09.486513 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" containerName="barbican-db-sync" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.486531 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" containerName="barbican-db-sync" Dec 06 01:23:09 crc kubenswrapper[4991]: E1206 01:23:09.486549 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41629f2e-563a-4e76-8ac2-530e8d204578" containerName="cinder-db-sync" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.486557 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="41629f2e-563a-4e76-8ac2-530e8d204578" containerName="cinder-db-sync" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.486718 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" containerName="barbican-db-sync" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.486737 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="41629f2e-563a-4e76-8ac2-530e8d204578" containerName="cinder-db-sync" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 
01:23:09.487662 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.494062 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.494278 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-25jds" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.494433 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.510775 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d69cddbc-dkwjg"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.533061 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7b4c56449d-5hvrn"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.534682 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.538304 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.554122 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b4c56449d-5hvrn"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591113 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-combined-ca-bundle\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591174 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsw8\" (UniqueName: \"kubernetes.io/projected/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-kube-api-access-jlsw8\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591212 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-config-data-custom\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591243 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-config-data\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591272 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-config-data\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591289 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-config-data-custom\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591306 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-logs\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591360 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfds\" (UniqueName: \"kubernetes.io/projected/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-kube-api-access-mgfds\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591382 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-logs\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.591404 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-combined-ca-bundle\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.620060 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gwv4p"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.621626 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.637714 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gwv4p"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699369 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-config-data-custom\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699534 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-svc\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699592 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-config-data\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699708 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-config-data\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699745 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-config-data-custom\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699785 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-logs\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699846 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699894 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfds\" (UniqueName: \"kubernetes.io/projected/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-kube-api-access-mgfds\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699917 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-logs\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.699948 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-combined-ca-bundle\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.700008 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5llpb\" (UniqueName: \"kubernetes.io/projected/c271958a-d5c0-4990-9c44-8655731b309a-kube-api-access-5llpb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.700097 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.700350 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-config\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.700388 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-combined-ca-bundle\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.700424 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.700471 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlsw8\" (UniqueName: \"kubernetes.io/projected/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-kube-api-access-jlsw8\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.708311 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-logs\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.708786 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-config-data\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.712139 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-logs\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.712339 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-config-data-custom\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.726345 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-combined-ca-bundle\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.734919 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-combined-ca-bundle\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.737896 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfds\" (UniqueName: \"kubernetes.io/projected/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-kube-api-access-mgfds\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.738326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc3413e2-6d39-4013-94ac-fb8d1c42e5ff-config-data-custom\") pod \"barbican-worker-7d69cddbc-dkwjg\" (UID: \"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff\") " pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.738631 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-config-data\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.742912 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsw8\" (UniqueName: \"kubernetes.io/projected/79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1-kube-api-access-jlsw8\") pod \"barbican-keystone-listener-7b4c56449d-5hvrn\" (UID: \"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1\") " pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.751062 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.752584 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.760139 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.765565 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.765773 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6wbm9" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.765918 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.766182 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804576 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5llpb\" (UniqueName: \"kubernetes.io/projected/c271958a-d5c0-4990-9c44-8655731b309a-kube-api-access-5llpb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804634 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804673 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a95fbb6-9365-42b5-8d44-5805af860743-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804709 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804725 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804744 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-config\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804767 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804797 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " 
pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804825 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-scripts\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804843 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-svc\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804865 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbcc\" (UniqueName: \"kubernetes.io/projected/2a95fbb6-9365-42b5-8d44-5805af860743-kube-api-access-8vbcc\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.804895 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.806609 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 
01:23:09.806843 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.807867 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-svc\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.811420 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d69cddbc-dkwjg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.818188 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.825028 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-config\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.835169 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gwv4p"] Dec 06 01:23:09 crc kubenswrapper[4991]: E1206 01:23:09.835843 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5llpb], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" podUID="c271958a-d5c0-4990-9c44-8655731b309a" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.843750 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5llpb\" (UniqueName: \"kubernetes.io/projected/c271958a-d5c0-4990-9c44-8655731b309a-kube-api-access-5llpb\") pod \"dnsmasq-dns-688c87cc99-gwv4p\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.874669 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c65b84cc8-zjpfz"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.891208 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.893113 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.902662 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908340 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntlw5\" (UniqueName: \"kubernetes.io/projected/34a2a9f3-1540-40b7-b00f-e67b824dc610-kube-api-access-ntlw5\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908407 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a95fbb6-9365-42b5-8d44-5805af860743-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908462 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908489 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908545 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 
crc kubenswrapper[4991]: I1206 01:23:09.908586 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data-custom\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908611 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-scripts\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908646 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbcc\" (UniqueName: \"kubernetes.io/projected/2a95fbb6-9365-42b5-8d44-5805af860743-kube-api-access-8vbcc\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908674 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a2a9f3-1540-40b7-b00f-e67b824dc610-logs\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908721 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908751 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-combined-ca-bundle\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.908872 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a95fbb6-9365-42b5-8d44-5805af860743-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.942967 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.943415 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-scripts\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.943435 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.945361 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c65b84cc8-zjpfz"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.958507 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbcc\" (UniqueName: \"kubernetes.io/projected/2a95fbb6-9365-42b5-8d44-5805af860743-kube-api-access-8vbcc\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.961335 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jsqrg"] Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.944762 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.965233 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:09 crc kubenswrapper[4991]: I1206 01:23:09.988855 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jsqrg"] Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.009802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a2a9f3-1540-40b7-b00f-e67b824dc610-logs\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.009854 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: 
I1206 01:23:10.009882 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.009906 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.009926 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.009944 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-combined-ca-bundle\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.009976 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntlw5\" (UniqueName: \"kubernetes.io/projected/34a2a9f3-1540-40b7-b00f-e67b824dc610-kube-api-access-ntlw5\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 
01:23:10.010017 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-config\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.010034 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.010067 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldpfw\" (UniqueName: \"kubernetes.io/projected/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-kube-api-access-ldpfw\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.010121 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data-custom\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.010942 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a2a9f3-1540-40b7-b00f-e67b824dc610-logs\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.026777 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.036329 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data-custom\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.048794 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-combined-ca-bundle\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.049291 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntlw5\" (UniqueName: \"kubernetes.io/projected/34a2a9f3-1540-40b7-b00f-e67b824dc610-kube-api-access-ntlw5\") pod \"barbican-api-c65b84cc8-zjpfz\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.079840 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.104090 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.113738 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.113861 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.114571 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.114623 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.114651 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.114711 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-config\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: 
\"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.114781 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.114829 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldpfw\" (UniqueName: \"kubernetes.io/projected/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-kube-api-access-ldpfw\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.116381 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.117241 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-config\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.117705 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " 
pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.121953 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.123665 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.127697 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: E1206 01:23:10.168309 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.223756 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.223839 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.223872 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-scripts\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.223923 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf196fe-b548-409b-9297-8e181d01ef54-logs\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.224002 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data-custom\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.224023 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4784\" (UniqueName: \"kubernetes.io/projected/5bf196fe-b548-409b-9297-8e181d01ef54-kube-api-access-q4784\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.224076 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf196fe-b548-409b-9297-8e181d01ef54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.229043 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldpfw\" (UniqueName: \"kubernetes.io/projected/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-kube-api-access-ldpfw\") pod \"dnsmasq-dns-6bb4fc677f-jsqrg\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.258616 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.289220 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.349268 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data-custom\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.349541 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4784\" (UniqueName: \"kubernetes.io/projected/5bf196fe-b548-409b-9297-8e181d01ef54-kube-api-access-q4784\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.349686 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf196fe-b548-409b-9297-8e181d01ef54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.349820 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.349857 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.349896 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-scripts\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.350011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf196fe-b548-409b-9297-8e181d01ef54-logs\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.350716 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf196fe-b548-409b-9297-8e181d01ef54-logs\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.360209 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data-custom\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.363930 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.364273 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf196fe-b548-409b-9297-8e181d01ef54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.368774 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.379956 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-scripts\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.395997 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4784\" (UniqueName: \"kubernetes.io/projected/5bf196fe-b548-409b-9297-8e181d01ef54-kube-api-access-q4784\") pod \"cinder-api-0\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.475395 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.489965 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.491895 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="proxy-httpd" containerID="cri-o://4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693" gracePeriod=30 Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.492316 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="sg-core" containerID="cri-o://539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484" gracePeriod=30 Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.495786 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="ceilometer-notification-agent" containerID="cri-o://50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624" gracePeriod=30 Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.501153 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerStarted","Data":"4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693"} Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.501233 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.538488 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.674379 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-swift-storage-0\") pod \"c271958a-d5c0-4990-9c44-8655731b309a\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.674956 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-config\") pod \"c271958a-d5c0-4990-9c44-8655731b309a\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.674999 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-sb\") pod \"c271958a-d5c0-4990-9c44-8655731b309a\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675097 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c271958a-d5c0-4990-9c44-8655731b309a" (UID: "c271958a-d5c0-4990-9c44-8655731b309a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675156 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-nb\") pod \"c271958a-d5c0-4990-9c44-8655731b309a\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675203 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-svc\") pod \"c271958a-d5c0-4990-9c44-8655731b309a\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675348 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5llpb\" (UniqueName: \"kubernetes.io/projected/c271958a-d5c0-4990-9c44-8655731b309a-kube-api-access-5llpb\") pod \"c271958a-d5c0-4990-9c44-8655731b309a\" (UID: \"c271958a-d5c0-4990-9c44-8655731b309a\") " Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675503 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-config" (OuterVolumeSpecName: "config") pod "c271958a-d5c0-4990-9c44-8655731b309a" (UID: "c271958a-d5c0-4990-9c44-8655731b309a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675577 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c271958a-d5c0-4990-9c44-8655731b309a" (UID: "c271958a-d5c0-4990-9c44-8655731b309a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675874 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c271958a-d5c0-4990-9c44-8655731b309a" (UID: "c271958a-d5c0-4990-9c44-8655731b309a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.675923 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c271958a-d5c0-4990-9c44-8655731b309a" (UID: "c271958a-d5c0-4990-9c44-8655731b309a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.676476 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.676499 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.676512 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.676522 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 
01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.676533 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c271958a-d5c0-4990-9c44-8655731b309a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.686460 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c271958a-d5c0-4990-9c44-8655731b309a-kube-api-access-5llpb" (OuterVolumeSpecName: "kube-api-access-5llpb") pod "c271958a-d5c0-4990-9c44-8655731b309a" (UID: "c271958a-d5c0-4990-9c44-8655731b309a"). InnerVolumeSpecName "kube-api-access-5llpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.780222 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5llpb\" (UniqueName: \"kubernetes.io/projected/c271958a-d5c0-4990-9c44-8655731b309a-kube-api-access-5llpb\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.794367 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d69cddbc-dkwjg"] Dec 06 01:23:10 crc kubenswrapper[4991]: I1206 01:23:10.947155 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b4c56449d-5hvrn"] Dec 06 01:23:10 crc kubenswrapper[4991]: W1206 01:23:10.963453 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79ffe5af_2a10_4cd9_b8bc_1c2e49dae8b1.slice/crio-144cfa691344dfd35f594ed79eb457a508d21d156a93ce13d2a64fca2198fbd3 WatchSource:0}: Error finding container 144cfa691344dfd35f594ed79eb457a508d21d156a93ce13d2a64fca2198fbd3: Status 404 returned error can't find the container with id 144cfa691344dfd35f594ed79eb457a508d21d156a93ce13d2a64fca2198fbd3 Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.030435 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.030503 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.082436 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:11 crc kubenswrapper[4991]: W1206 01:23:11.083074 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2631c52c_c2fb_40cb_beb6_7c5035dff7fe.slice/crio-13461b804c8122865d243420779bcdf08ec4459a42d35603d133dc7a22609f7f WatchSource:0}: Error finding container 13461b804c8122865d243420779bcdf08ec4459a42d35603d133dc7a22609f7f: Status 404 returned error can't find the container with id 13461b804c8122865d243420779bcdf08ec4459a42d35603d133dc7a22609f7f Dec 06 01:23:11 crc kubenswrapper[4991]: W1206 01:23:11.090904 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a95fbb6_9365_42b5_8d44_5805af860743.slice/crio-72e111792f27f94cab031a71043fd51a2ad3371eeedeca7eeb22723b4a566ab9 WatchSource:0}: Error finding container 72e111792f27f94cab031a71043fd51a2ad3371eeedeca7eeb22723b4a566ab9: Status 404 returned error can't find the container with id 72e111792f27f94cab031a71043fd51a2ad3371eeedeca7eeb22723b4a566ab9 Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.121374 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jsqrg"] Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.177192 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c65b84cc8-zjpfz"] Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.205491 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" 
Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.205581 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.270820 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.525088 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerDied","Data":"4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.525174 4991 generic.go:334] "Generic (PLEG): container finished" podID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerID="4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693" exitCode=0 Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.525653 4991 generic.go:334] "Generic (PLEG): container finished" podID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerID="539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484" exitCode=2 Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.525750 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerDied","Data":"539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.528167 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5bf196fe-b548-409b-9297-8e181d01ef54","Type":"ContainerStarted","Data":"c6774d23f01e847a64f1d8264057d11a5ef63a9dea0e0705b679f8eb3a60842c"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.531592 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" 
event={"ID":"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1","Type":"ContainerStarted","Data":"144cfa691344dfd35f594ed79eb457a508d21d156a93ce13d2a64fca2198fbd3"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.532488 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d69cddbc-dkwjg" event={"ID":"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff","Type":"ContainerStarted","Data":"00c9c3b826e0504c48ef72acd5a8dfa123baa04d45393c88a1ea17c02901248f"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.536793 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2a95fbb6-9365-42b5-8d44-5805af860743","Type":"ContainerStarted","Data":"72e111792f27f94cab031a71043fd51a2ad3371eeedeca7eeb22723b4a566ab9"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.546970 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c65b84cc8-zjpfz" event={"ID":"34a2a9f3-1540-40b7-b00f-e67b824dc610","Type":"ContainerStarted","Data":"75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.547057 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c65b84cc8-zjpfz" event={"ID":"34a2a9f3-1540-40b7-b00f-e67b824dc610","Type":"ContainerStarted","Data":"d97ccbe872ce77a3dc19bfbab9112a49f44db1f437ae37fa869194f5dced3ea7"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.577279 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" event={"ID":"2631c52c-c2fb-40cb-beb6-7c5035dff7fe","Type":"ContainerStarted","Data":"2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.577321 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-gwv4p" Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.577349 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" event={"ID":"2631c52c-c2fb-40cb-beb6-7c5035dff7fe","Type":"ContainerStarted","Data":"13461b804c8122865d243420779bcdf08ec4459a42d35603d133dc7a22609f7f"} Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.580192 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.580247 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.668103 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gwv4p"] Dec 06 01:23:11 crc kubenswrapper[4991]: I1206 01:23:11.702938 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-gwv4p"] Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.327880 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.591154 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c65b84cc8-zjpfz" event={"ID":"34a2a9f3-1540-40b7-b00f-e67b824dc610","Type":"ContainerStarted","Data":"4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4"} Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.591285 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.591304 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.594530 4991 generic.go:334] "Generic (PLEG): container 
finished" podID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerID="2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815" exitCode=0 Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.594895 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" event={"ID":"2631c52c-c2fb-40cb-beb6-7c5035dff7fe","Type":"ContainerDied","Data":"2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815"} Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.594949 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" event={"ID":"2631c52c-c2fb-40cb-beb6-7c5035dff7fe","Type":"ContainerStarted","Data":"788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3"} Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.595032 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.609291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5bf196fe-b548-409b-9297-8e181d01ef54","Type":"ContainerStarted","Data":"9af773cf8d803bbb5355798f2f498e8064a8c08bae72ec3bca5d7e4ffc19fefc"} Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.618840 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c65b84cc8-zjpfz" podStartSLOduration=3.6188092149999997 podStartE2EDuration="3.618809215s" podCreationTimestamp="2025-12-06 01:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:12.611966378 +0000 UTC m=+1265.962256846" watchObservedRunningTime="2025-12-06 01:23:12.618809215 +0000 UTC m=+1265.969099683" Dec 06 01:23:12 crc kubenswrapper[4991]: I1206 01:23:12.640439 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" podStartSLOduration=3.640420234 podStartE2EDuration="3.640420234s" podCreationTimestamp="2025-12-06 01:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:12.63407881 +0000 UTC m=+1265.984369278" watchObservedRunningTime="2025-12-06 01:23:12.640420234 +0000 UTC m=+1265.990710702" Dec 06 01:23:13 crc kubenswrapper[4991]: I1206 01:23:13.059950 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c271958a-d5c0-4990-9c44-8655731b309a" path="/var/lib/kubelet/pods/c271958a-d5c0-4990-9c44-8655731b309a/volumes" Dec 06 01:23:13 crc kubenswrapper[4991]: I1206 01:23:13.649801 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2a95fbb6-9365-42b5-8d44-5805af860743","Type":"ContainerStarted","Data":"7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.206881 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.207857 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.212611 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.486511 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b4cd4b7d8-99kjr" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.612450 4991 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.664935 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5bf196fe-b548-409b-9297-8e181d01ef54","Type":"ContainerStarted","Data":"fc80a01f2f8f2b9705e46f8aa3b37adce4e66449214245dc0943f89b38c4e570"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.665201 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api-log" containerID="cri-o://9af773cf8d803bbb5355798f2f498e8064a8c08bae72ec3bca5d7e4ffc19fefc" gracePeriod=30 Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.665536 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.665862 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api" containerID="cri-o://fc80a01f2f8f2b9705e46f8aa3b37adce4e66449214245dc0943f89b38c4e570" gracePeriod=30 Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.678772 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" event={"ID":"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1","Type":"ContainerStarted","Data":"d1686ad33617380f62b49c130aefb9d70363746ac2b28043db73c2d878cf652a"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.678832 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" event={"ID":"79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1","Type":"ContainerStarted","Data":"b7a016e448a375be4fa0ac67d5a95bce37b6adbd4bc35a5088b18ad158839d92"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.683208 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7d69cddbc-dkwjg" event={"ID":"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff","Type":"ContainerStarted","Data":"8e15e13118ee5ceef504caed422a22c3b216e7d06c6c7fd88619bec7f1479b95"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.683258 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d69cddbc-dkwjg" event={"ID":"cc3413e2-6d39-4013-94ac-fb8d1c42e5ff","Type":"ContainerStarted","Data":"93ecb47e6bbfa358a75da962f7df06f539037cec720b3b58cc2d338be33d149a"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.686132 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.686109653 podStartE2EDuration="4.686109653s" podCreationTimestamp="2025-12-06 01:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:14.684357218 +0000 UTC m=+1268.034647686" watchObservedRunningTime="2025-12-06 01:23:14.686109653 +0000 UTC m=+1268.036400121" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.696853 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7pcc\" (UniqueName: \"kubernetes.io/projected/62fc1d5f-d757-46ae-bea1-2a68ae61a794-kube-api-access-g7pcc\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.696907 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-sg-core-conf-yaml\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.696960 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-scripts\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.697181 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-combined-ca-bundle\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.697236 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-run-httpd\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.697328 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-config-data\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.697366 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-log-httpd\") pod \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\" (UID: \"62fc1d5f-d757-46ae-bea1-2a68ae61a794\") " Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.698623 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.698944 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.699383 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2a95fbb6-9365-42b5-8d44-5805af860743","Type":"ContainerStarted","Data":"beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.714299 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fc1d5f-d757-46ae-bea1-2a68ae61a794-kube-api-access-g7pcc" (OuterVolumeSpecName: "kube-api-access-g7pcc") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "kube-api-access-g7pcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.715759 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7b4c56449d-5hvrn" podStartSLOduration=3.226176292 podStartE2EDuration="5.7157401s" podCreationTimestamp="2025-12-06 01:23:09 +0000 UTC" firstStartedPulling="2025-12-06 01:23:10.967463942 +0000 UTC m=+1264.317754410" lastFinishedPulling="2025-12-06 01:23:13.45702775 +0000 UTC m=+1266.807318218" observedRunningTime="2025-12-06 01:23:14.712757812 +0000 UTC m=+1268.063048300" watchObservedRunningTime="2025-12-06 01:23:14.7157401 +0000 UTC m=+1268.066030568" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.754136 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-577fd657dd-psmdm" podUID="15de5638-5e2b-4ce1-9d25-8f2830cd9241" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.759680 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-scripts" (OuterVolumeSpecName: "scripts") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.766138 4991 generic.go:334] "Generic (PLEG): container finished" podID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerID="50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624" exitCode=0 Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.766966 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.767160 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerDied","Data":"50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.767189 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62fc1d5f-d757-46ae-bea1-2a68ae61a794","Type":"ContainerDied","Data":"07382850d344806885e3e3fe0fa323ed1da45cb7fd0f390e2e8ab1fb6912c0e5"} Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.767204 4991 scope.go:117] "RemoveContainer" containerID="4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.779969 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d69cddbc-dkwjg" podStartSLOduration=3.140120797 podStartE2EDuration="5.77995226s" podCreationTimestamp="2025-12-06 01:23:09 +0000 UTC" firstStartedPulling="2025-12-06 01:23:10.813651465 +0000 UTC m=+1264.163941923" lastFinishedPulling="2025-12-06 01:23:13.453482918 +0000 UTC m=+1266.803773386" observedRunningTime="2025-12-06 01:23:14.779073997 +0000 UTC m=+1268.129364485" watchObservedRunningTime="2025-12-06 01:23:14.77995226 +0000 UTC m=+1268.130242728" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.800899 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.800937 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62fc1d5f-d757-46ae-bea1-2a68ae61a794-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc 
kubenswrapper[4991]: I1206 01:23:14.800945 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7pcc\" (UniqueName: \"kubernetes.io/projected/62fc1d5f-d757-46ae-bea1-2a68ae61a794-kube-api-access-g7pcc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.801092 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.809313 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.846320 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.973948267 podStartE2EDuration="5.846302126s" podCreationTimestamp="2025-12-06 01:23:09 +0000 UTC" firstStartedPulling="2025-12-06 01:23:11.102083563 +0000 UTC m=+1264.452374031" lastFinishedPulling="2025-12-06 01:23:11.974437422 +0000 UTC m=+1265.324727890" observedRunningTime="2025-12-06 01:23:14.835959758 +0000 UTC m=+1268.186250216" watchObservedRunningTime="2025-12-06 01:23:14.846302126 +0000 UTC m=+1268.196592594" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.853013 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.875102 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-config-data" (OuterVolumeSpecName: "config-data") pod "62fc1d5f-d757-46ae-bea1-2a68ae61a794" (UID: "62fc1d5f-d757-46ae-bea1-2a68ae61a794"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.886142 4991 scope.go:117] "RemoveContainer" containerID="539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.903212 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.903244 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.903254 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62fc1d5f-d757-46ae-bea1-2a68ae61a794-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.914252 4991 scope.go:117] "RemoveContainer" containerID="50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.943970 4991 scope.go:117] "RemoveContainer" containerID="4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693" Dec 06 01:23:14 crc kubenswrapper[4991]: E1206 01:23:14.948198 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693\": container with ID starting with 4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693 not found: ID does not exist" containerID="4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.948260 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693"} err="failed to get container status \"4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693\": rpc error: code = NotFound desc = could not find container \"4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693\": container with ID starting with 4984086e917ec196648a9c3a053ee46385ab0b6eb2469f5074c801b78f268693 not found: ID does not exist" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.948299 4991 scope.go:117] "RemoveContainer" containerID="539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484" Dec 06 01:23:14 crc kubenswrapper[4991]: E1206 01:23:14.949358 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484\": container with ID starting with 539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484 not found: ID does not exist" containerID="539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.949415 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484"} err="failed to get container status \"539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484\": rpc error: code = NotFound desc = could not find container \"539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484\": container 
with ID starting with 539defb8b37cc43feb594b8d4dd7f986913c50591b1426a808a5b64f0b61c484 not found: ID does not exist" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.949459 4991 scope.go:117] "RemoveContainer" containerID="50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624" Dec 06 01:23:14 crc kubenswrapper[4991]: E1206 01:23:14.950028 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624\": container with ID starting with 50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624 not found: ID does not exist" containerID="50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624" Dec 06 01:23:14 crc kubenswrapper[4991]: I1206 01:23:14.950847 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624"} err="failed to get container status \"50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624\": rpc error: code = NotFound desc = could not find container \"50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624\": container with ID starting with 50d14529429efe3966e15ac4dbd2bcfc747722bc46039f193a48291063f08624 not found: ID does not exist" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.080715 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.110310 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.117186 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.164174 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:15 crc 
kubenswrapper[4991]: E1206 01:23:15.164721 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="proxy-httpd" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.164740 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="proxy-httpd" Dec 06 01:23:15 crc kubenswrapper[4991]: E1206 01:23:15.164798 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="ceilometer-notification-agent" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.164807 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="ceilometer-notification-agent" Dec 06 01:23:15 crc kubenswrapper[4991]: E1206 01:23:15.164820 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="sg-core" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.164845 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="sg-core" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.165118 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="sg-core" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.165148 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="proxy-httpd" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.165160 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" containerName="ceilometer-notification-agent" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.166765 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.171160 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.171365 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.180417 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.309906 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-run-httpd\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.310030 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.310068 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ssn8\" (UniqueName: \"kubernetes.io/projected/eedff750-2cc3-4eff-a889-79e542f0ba39-kube-api-access-2ssn8\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.310109 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-scripts\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " 
pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.310137 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.310171 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-log-httpd\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.310193 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-config-data\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411458 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-run-httpd\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411524 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411566 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2ssn8\" (UniqueName: \"kubernetes.io/projected/eedff750-2cc3-4eff-a889-79e542f0ba39-kube-api-access-2ssn8\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411619 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-scripts\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411652 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411698 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-log-httpd\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.411721 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-config-data\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.412243 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-run-httpd\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc 
kubenswrapper[4991]: I1206 01:23:15.412375 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-log-httpd\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.420509 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.420847 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.423237 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-scripts\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.440075 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-config-data\") pod \"ceilometer-0\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.443671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ssn8\" (UniqueName: \"kubernetes.io/projected/eedff750-2cc3-4eff-a889-79e542f0ba39-kube-api-access-2ssn8\") pod \"ceilometer-0\" (UID: 
\"eedff750-2cc3-4eff-a889-79e542f0ba39\") " pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.493422 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.783635 4991 generic.go:334] "Generic (PLEG): container finished" podID="5bf196fe-b548-409b-9297-8e181d01ef54" containerID="9af773cf8d803bbb5355798f2f498e8064a8c08bae72ec3bca5d7e4ffc19fefc" exitCode=143 Dec 06 01:23:15 crc kubenswrapper[4991]: I1206 01:23:15.786006 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5bf196fe-b548-409b-9297-8e181d01ef54","Type":"ContainerDied","Data":"9af773cf8d803bbb5355798f2f498e8064a8c08bae72ec3bca5d7e4ffc19fefc"} Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.072209 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d7f8d97fd-bwgzp"] Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.084842 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.085046 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.089262 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.089347 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.098765 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d7f8d97fd-bwgzp"] Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.128581 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-config-data\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.128627 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-internal-tls-certs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.128664 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-combined-ca-bundle\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.128845 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-config-data-custom\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.128924 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9gwx\" (UniqueName: \"kubernetes.io/projected/b53a8fe6-006f-49e8-9598-dd986c58e97a-kube-api-access-r9gwx\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.129009 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53a8fe6-006f-49e8-9598-dd986c58e97a-logs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.129130 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-public-tls-certs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232246 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-config-data\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232331 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-internal-tls-certs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232381 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-combined-ca-bundle\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232470 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-config-data-custom\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232508 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9gwx\" (UniqueName: \"kubernetes.io/projected/b53a8fe6-006f-49e8-9598-dd986c58e97a-kube-api-access-r9gwx\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232554 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53a8fe6-006f-49e8-9598-dd986c58e97a-logs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.232626 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-public-tls-certs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.233714 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53a8fe6-006f-49e8-9598-dd986c58e97a-logs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.240899 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-combined-ca-bundle\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.247141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-config-data-custom\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.257462 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-public-tls-certs\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.264744 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-internal-tls-certs\") pod 
\"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.271478 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9gwx\" (UniqueName: \"kubernetes.io/projected/b53a8fe6-006f-49e8-9598-dd986c58e97a-kube-api-access-r9gwx\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.271897 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53a8fe6-006f-49e8-9598-dd986c58e97a-config-data\") pod \"barbican-api-5d7f8d97fd-bwgzp\" (UID: \"b53a8fe6-006f-49e8-9598-dd986c58e97a\") " pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.465070 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.739714 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.740717 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.740789 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.741880 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dba3ac16754d5a6b2843c8e44c7c8c77526e3ed39ca408957653647034fac3f9"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.741948 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://dba3ac16754d5a6b2843c8e44c7c8c77526e3ed39ca408957653647034fac3f9" gracePeriod=600 Dec 06 01:23:16 crc kubenswrapper[4991]: I1206 01:23:16.817087 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerStarted","Data":"454db986e019c82e2107cf7807112494fc6b1e8137dfc0c1626df344a412dbb7"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.119658 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62fc1d5f-d757-46ae-bea1-2a68ae61a794" path="/var/lib/kubelet/pods/62fc1d5f-d757-46ae-bea1-2a68ae61a794/volumes" Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.150439 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d7f8d97fd-bwgzp"] Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.323499 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.834045 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="dba3ac16754d5a6b2843c8e44c7c8c77526e3ed39ca408957653647034fac3f9" exitCode=0 Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.835394 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"dba3ac16754d5a6b2843c8e44c7c8c77526e3ed39ca408957653647034fac3f9"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.835645 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"fcb06d7c03c6d3eab8bf59045483a7760319fe20cd9fc9098eb56b0b1ff9d4ef"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.835673 4991 scope.go:117] "RemoveContainer" containerID="f375611be938212e8f0efe7042b61ab4ee937e28fa539d018afc5fdc1b82d418" Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.847151 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerStarted","Data":"d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.847211 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerStarted","Data":"2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.852472 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" event={"ID":"b53a8fe6-006f-49e8-9598-dd986c58e97a","Type":"ContainerStarted","Data":"1f62840b9d2e00661ce67fbdad7923b929452d13d1670878260ee153ccf48ec5"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.852565 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" event={"ID":"b53a8fe6-006f-49e8-9598-dd986c58e97a","Type":"ContainerStarted","Data":"d66b97bee1f58b43de463b5fed6776a0627880415d3cd41291dad269a3b97a99"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.852592 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" event={"ID":"b53a8fe6-006f-49e8-9598-dd986c58e97a","Type":"ContainerStarted","Data":"ebe09bc5a27f13d9a8ec9487da235e2d645445ba32bc1fe8915da3236aa4c621"} Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.853659 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.853700 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:17 crc kubenswrapper[4991]: I1206 01:23:17.889438 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" podStartSLOduration=1.889420578 
podStartE2EDuration="1.889420578s" podCreationTimestamp="2025-12-06 01:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:17.887223162 +0000 UTC m=+1271.237513630" watchObservedRunningTime="2025-12-06 01:23:17.889420578 +0000 UTC m=+1271.239711046" Dec 06 01:23:18 crc kubenswrapper[4991]: I1206 01:23:18.868793 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerStarted","Data":"f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c"} Dec 06 01:23:19 crc kubenswrapper[4991]: I1206 01:23:19.882613 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerStarted","Data":"6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14"} Dec 06 01:23:19 crc kubenswrapper[4991]: I1206 01:23:19.883514 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:23:19 crc kubenswrapper[4991]: I1206 01:23:19.909185 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.693684208 podStartE2EDuration="4.909136907s" podCreationTimestamp="2025-12-06 01:23:15 +0000 UTC" firstStartedPulling="2025-12-06 01:23:16.114208673 +0000 UTC m=+1269.464499141" lastFinishedPulling="2025-12-06 01:23:19.329661372 +0000 UTC m=+1272.679951840" observedRunningTime="2025-12-06 01:23:19.905277808 +0000 UTC m=+1273.255568276" watchObservedRunningTime="2025-12-06 01:23:19.909136907 +0000 UTC m=+1273.259427375" Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.291238 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.317059 4991 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.409307 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.425497 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k5mql"] Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.425876 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerName="dnsmasq-dns" containerID="cri-o://84e655dd752044e98c8eb4073341bbe508f91f987fa6dc42189d09912c7132e2" gracePeriod=10 Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.896460 4991 generic.go:334] "Generic (PLEG): container finished" podID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerID="84e655dd752044e98c8eb4073341bbe508f91f987fa6dc42189d09912c7132e2" exitCode=0 Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.897121 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" event={"ID":"42ea5f2a-f150-4554-9f87-7ac46f04fd5b","Type":"ContainerDied","Data":"84e655dd752044e98c8eb4073341bbe508f91f987fa6dc42189d09912c7132e2"} Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.897381 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="cinder-scheduler" containerID="cri-o://7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974" gracePeriod=30 Dec 06 01:23:20 crc kubenswrapper[4991]: I1206 01:23:20.899440 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="probe" 
containerID="cri-o://beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b" gracePeriod=30 Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.050042 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.200376 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-sb\") pod \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.200435 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-nb\") pod \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.200506 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-config\") pod \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.200740 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tsrm\" (UniqueName: \"kubernetes.io/projected/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-kube-api-access-6tsrm\") pod \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.200810 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-swift-storage-0\") pod 
\"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.200858 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-svc\") pod \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\" (UID: \"42ea5f2a-f150-4554-9f87-7ac46f04fd5b\") " Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.209271 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-kube-api-access-6tsrm" (OuterVolumeSpecName: "kube-api-access-6tsrm") pod "42ea5f2a-f150-4554-9f87-7ac46f04fd5b" (UID: "42ea5f2a-f150-4554-9f87-7ac46f04fd5b"). InnerVolumeSpecName "kube-api-access-6tsrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.255719 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-config" (OuterVolumeSpecName: "config") pod "42ea5f2a-f150-4554-9f87-7ac46f04fd5b" (UID: "42ea5f2a-f150-4554-9f87-7ac46f04fd5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.266778 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42ea5f2a-f150-4554-9f87-7ac46f04fd5b" (UID: "42ea5f2a-f150-4554-9f87-7ac46f04fd5b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.283349 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42ea5f2a-f150-4554-9f87-7ac46f04fd5b" (UID: "42ea5f2a-f150-4554-9f87-7ac46f04fd5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.283721 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42ea5f2a-f150-4554-9f87-7ac46f04fd5b" (UID: "42ea5f2a-f150-4554-9f87-7ac46f04fd5b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.285514 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42ea5f2a-f150-4554-9f87-7ac46f04fd5b" (UID: "42ea5f2a-f150-4554-9f87-7ac46f04fd5b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.306354 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tsrm\" (UniqueName: \"kubernetes.io/projected/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-kube-api-access-6tsrm\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.306400 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.306411 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.306421 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.306430 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.306438 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ea5f2a-f150-4554-9f87-7ac46f04fd5b-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.804896 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.910075 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="2a95fbb6-9365-42b5-8d44-5805af860743" containerID="beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b" exitCode=0 Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.910178 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2a95fbb6-9365-42b5-8d44-5805af860743","Type":"ContainerDied","Data":"beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b"} Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.912829 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" event={"ID":"42ea5f2a-f150-4554-9f87-7ac46f04fd5b","Type":"ContainerDied","Data":"8ac3748ed9bdac5e1bc0a6e2e02af985a2594df00de594efc4338f9cf2a8b1c9"} Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.912886 4991 scope.go:117] "RemoveContainer" containerID="84e655dd752044e98c8eb4073341bbe508f91f987fa6dc42189d09912c7132e2" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.912884 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-k5mql" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.943143 4991 scope.go:117] "RemoveContainer" containerID="7902b390abdafc3527d4e237f3a3caff8d6e88c20108f4a1a1e904674c3803d6" Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.955226 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k5mql"] Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.962914 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-k5mql"] Dec 06 01:23:21 crc kubenswrapper[4991]: I1206 01:23:21.967837 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.047211 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" path="/var/lib/kubelet/pods/42ea5f2a-f150-4554-9f87-7ac46f04fd5b/volumes" Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.214172 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.705525 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-779776dbb5-wnlcw" Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.777932 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84bc9ddf6-jd669"] Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.778428 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84bc9ddf6-jd669" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-api" containerID="cri-o://7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf" gracePeriod=30 Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.779402 4991 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/neutron-84bc9ddf6-jd669" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-httpd" containerID="cri-o://83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3" gracePeriod=30 Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.938700 4991 generic.go:334] "Generic (PLEG): container finished" podID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerID="83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3" exitCode=0 Dec 06 01:23:23 crc kubenswrapper[4991]: I1206 01:23:23.939146 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bc9ddf6-jd669" event={"ID":"b305d8ce-5702-42c2-979a-15e0f7e46cae","Type":"ContainerDied","Data":"83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3"} Dec 06 01:23:25 crc kubenswrapper[4991]: I1206 01:23:25.958260 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 01:23:25 crc kubenswrapper[4991]: I1206 01:23:25.964507 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2a95fbb6-9365-42b5-8d44-5805af860743","Type":"ContainerDied","Data":"7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974"} Dec 06 01:23:25 crc kubenswrapper[4991]: I1206 01:23:25.964595 4991 scope.go:117] "RemoveContainer" containerID="beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b" Dec 06 01:23:25 crc kubenswrapper[4991]: I1206 01:23:25.964455 4991 generic.go:334] "Generic (PLEG): container finished" podID="2a95fbb6-9365-42b5-8d44-5805af860743" containerID="7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974" exitCode=0 Dec 06 01:23:25 crc kubenswrapper[4991]: I1206 01:23:25.964918 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"2a95fbb6-9365-42b5-8d44-5805af860743","Type":"ContainerDied","Data":"72e111792f27f94cab031a71043fd51a2ad3371eeedeca7eeb22723b4a566ab9"} Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.004382 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data-custom\") pod \"2a95fbb6-9365-42b5-8d44-5805af860743\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.004483 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-scripts\") pod \"2a95fbb6-9365-42b5-8d44-5805af860743\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.004582 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data\") pod \"2a95fbb6-9365-42b5-8d44-5805af860743\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.004688 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-combined-ca-bundle\") pod \"2a95fbb6-9365-42b5-8d44-5805af860743\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.004824 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a95fbb6-9365-42b5-8d44-5805af860743-etc-machine-id\") pod \"2a95fbb6-9365-42b5-8d44-5805af860743\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.004919 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8vbcc\" (UniqueName: \"kubernetes.io/projected/2a95fbb6-9365-42b5-8d44-5805af860743-kube-api-access-8vbcc\") pod \"2a95fbb6-9365-42b5-8d44-5805af860743\" (UID: \"2a95fbb6-9365-42b5-8d44-5805af860743\") " Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.009190 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a95fbb6-9365-42b5-8d44-5805af860743-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2a95fbb6-9365-42b5-8d44-5805af860743" (UID: "2a95fbb6-9365-42b5-8d44-5805af860743"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.014547 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-scripts" (OuterVolumeSpecName: "scripts") pod "2a95fbb6-9365-42b5-8d44-5805af860743" (UID: "2a95fbb6-9365-42b5-8d44-5805af860743"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.018825 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2a95fbb6-9365-42b5-8d44-5805af860743" (UID: "2a95fbb6-9365-42b5-8d44-5805af860743"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.048192 4991 scope.go:117] "RemoveContainer" containerID="7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.048592 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a95fbb6-9365-42b5-8d44-5805af860743-kube-api-access-8vbcc" (OuterVolumeSpecName: "kube-api-access-8vbcc") pod "2a95fbb6-9365-42b5-8d44-5805af860743" (UID: "2a95fbb6-9365-42b5-8d44-5805af860743"). InnerVolumeSpecName "kube-api-access-8vbcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.107569 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a95fbb6-9365-42b5-8d44-5805af860743-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.107607 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vbcc\" (UniqueName: \"kubernetes.io/projected/2a95fbb6-9365-42b5-8d44-5805af860743-kube-api-access-8vbcc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.107622 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.107630 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.131133 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "2a95fbb6-9365-42b5-8d44-5805af860743" (UID: "2a95fbb6-9365-42b5-8d44-5805af860743"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.142004 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data" (OuterVolumeSpecName: "config-data") pod "2a95fbb6-9365-42b5-8d44-5805af860743" (UID: "2a95fbb6-9365-42b5-8d44-5805af860743"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.144801 4991 scope.go:117] "RemoveContainer" containerID="beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b" Dec 06 01:23:26 crc kubenswrapper[4991]: E1206 01:23:26.145681 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b\": container with ID starting with beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b not found: ID does not exist" containerID="beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.145720 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b"} err="failed to get container status \"beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b\": rpc error: code = NotFound desc = could not find container \"beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b\": container with ID starting with beecea3d4d300304002967fb18e3994793ffce123844acac70d908d6dbad7a4b not found: ID does not exist" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.145743 4991 scope.go:117] "RemoveContainer" 
containerID="7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974" Dec 06 01:23:26 crc kubenswrapper[4991]: E1206 01:23:26.149626 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974\": container with ID starting with 7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974 not found: ID does not exist" containerID="7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.149654 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974"} err="failed to get container status \"7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974\": rpc error: code = NotFound desc = could not find container \"7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974\": container with ID starting with 7278c6ccbb39b36d1cdc90cd9f3c190e5a089beb10ad1328328dd623fa443974 not found: ID does not exist" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.209259 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.209298 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a95fbb6-9365-42b5-8d44-5805af860743-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.774289 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.950479 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:23:26 crc kubenswrapper[4991]: I1206 01:23:26.982516 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.020905 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.032392 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.066315 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" path="/var/lib/kubelet/pods/2a95fbb6-9365-42b5-8d44-5805af860743/volumes" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.066917 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:27 crc kubenswrapper[4991]: E1206 01:23:27.067268 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerName="dnsmasq-dns" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067284 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerName="dnsmasq-dns" Dec 06 01:23:27 crc kubenswrapper[4991]: E1206 01:23:27.067303 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerName="init" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067310 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerName="init" Dec 06 01:23:27 crc kubenswrapper[4991]: E1206 01:23:27.067328 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="cinder-scheduler" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067334 4991 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="cinder-scheduler" Dec 06 01:23:27 crc kubenswrapper[4991]: E1206 01:23:27.067344 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="probe" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067350 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="probe" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067515 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="cinder-scheduler" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067537 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a95fbb6-9365-42b5-8d44-5805af860743" containerName="probe" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.067551 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ea5f2a-f150-4554-9f87-7ac46f04fd5b" containerName="dnsmasq-dns" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.068553 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.082205 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.088782 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.125174 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-config-data\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.125235 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjsz6\" (UniqueName: \"kubernetes.io/projected/89779edf-f784-441f-9420-1e03444eea94-kube-api-access-rjsz6\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.125255 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-scripts\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.125279 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.125308 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89779edf-f784-441f-9420-1e03444eea94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.125399 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.226715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.226804 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-config-data\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.226838 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjsz6\" (UniqueName: \"kubernetes.io/projected/89779edf-f784-441f-9420-1e03444eea94-kube-api-access-rjsz6\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.226857 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-scripts\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.226880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.226905 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89779edf-f784-441f-9420-1e03444eea94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.227025 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89779edf-f784-441f-9420-1e03444eea94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.246486 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-config-data\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.246830 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " 
pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.249837 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-scripts\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.257812 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89779edf-f784-441f-9420-1e03444eea94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.263366 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjsz6\" (UniqueName: \"kubernetes.io/projected/89779edf-f784-441f-9420-1e03444eea94-kube-api-access-rjsz6\") pod \"cinder-scheduler-0\" (UID: \"89779edf-f784-441f-9420-1e03444eea94\") " pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.401103 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.930617 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 06 01:23:27 crc kubenswrapper[4991]: I1206 01:23:27.995680 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89779edf-f784-441f-9420-1e03444eea94","Type":"ContainerStarted","Data":"a9d31b14f262a3e4d00eddfd93a8066fdb9d2bd7682bb83be39486c8c79bc79b"} Dec 06 01:23:28 crc kubenswrapper[4991]: I1206 01:23:28.138802 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:28 crc kubenswrapper[4991]: I1206 01:23:28.194921 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d7f8d97fd-bwgzp" Dec 06 01:23:28 crc kubenswrapper[4991]: I1206 01:23:28.257826 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-c65b84cc8-zjpfz"] Dec 06 01:23:28 crc kubenswrapper[4991]: I1206 01:23:28.258061 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-c65b84cc8-zjpfz" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api-log" containerID="cri-o://75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260" gracePeriod=30 Dec 06 01:23:28 crc kubenswrapper[4991]: I1206 01:23:28.258582 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-c65b84cc8-zjpfz" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api" containerID="cri-o://4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4" gracePeriod=30 Dec 06 01:23:28 crc kubenswrapper[4991]: I1206 01:23:28.635752 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 
01:23:29.004852 4991 generic.go:334] "Generic (PLEG): container finished" podID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerID="75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260" exitCode=143 Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.005246 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c65b84cc8-zjpfz" event={"ID":"34a2a9f3-1540-40b7-b00f-e67b824dc610","Type":"ContainerDied","Data":"75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260"} Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.012251 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89779edf-f784-441f-9420-1e03444eea94","Type":"ContainerStarted","Data":"880fd028303c930d396420800cdd5b6085ec798604829d284a8ed578bda06290"} Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.153432 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-577fd657dd-psmdm" Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.222410 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b4cd4b7d8-99kjr"] Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.222627 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b4cd4b7d8-99kjr" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon-log" containerID="cri-o://9c8d337db25b8faca94ece7dc0776b6fb45830b2b42f2b7d52e4a5a4778a214e" gracePeriod=30 Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.222712 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b4cd4b7d8-99kjr" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" containerID="cri-o://696c3f31706c7bcf6be1fafe3f512c3ea184441fcb34083126b168a2bdddc61d" gracePeriod=30 Dec 06 01:23:29 crc kubenswrapper[4991]: I1206 01:23:29.889480 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.007677 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-httpd-config\") pod \"b305d8ce-5702-42c2-979a-15e0f7e46cae\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.008172 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-config\") pod \"b305d8ce-5702-42c2-979a-15e0f7e46cae\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.008792 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-combined-ca-bundle\") pod \"b305d8ce-5702-42c2-979a-15e0f7e46cae\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.009568 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n7fd\" (UniqueName: \"kubernetes.io/projected/b305d8ce-5702-42c2-979a-15e0f7e46cae-kube-api-access-9n7fd\") pod \"b305d8ce-5702-42c2-979a-15e0f7e46cae\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.009837 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-ovndb-tls-certs\") pod \"b305d8ce-5702-42c2-979a-15e0f7e46cae\" (UID: \"b305d8ce-5702-42c2-979a-15e0f7e46cae\") " Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.016369 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b305d8ce-5702-42c2-979a-15e0f7e46cae-kube-api-access-9n7fd" (OuterVolumeSpecName: "kube-api-access-9n7fd") pod "b305d8ce-5702-42c2-979a-15e0f7e46cae" (UID: "b305d8ce-5702-42c2-979a-15e0f7e46cae"). InnerVolumeSpecName "kube-api-access-9n7fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.019598 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b305d8ce-5702-42c2-979a-15e0f7e46cae" (UID: "b305d8ce-5702-42c2-979a-15e0f7e46cae"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.036150 4991 generic.go:334] "Generic (PLEG): container finished" podID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerID="7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf" exitCode=0 Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.036484 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bc9ddf6-jd669" event={"ID":"b305d8ce-5702-42c2-979a-15e0f7e46cae","Type":"ContainerDied","Data":"7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf"} Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.036654 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84bc9ddf6-jd669" event={"ID":"b305d8ce-5702-42c2-979a-15e0f7e46cae","Type":"ContainerDied","Data":"3eb09c7d4664210572ecc25efb6e48b6166dc54613ca340bce7a1acf4114a777"} Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.036779 4991 scope.go:117] "RemoveContainer" containerID="83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.037121 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84bc9ddf6-jd669" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.047870 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89779edf-f784-441f-9420-1e03444eea94","Type":"ContainerStarted","Data":"c6356069e7607cafc4e58b43b61ccd51dcb6a31cab93713a986e9f204063d1fa"} Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.076572 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.076553219 podStartE2EDuration="3.076553219s" podCreationTimestamp="2025-12-06 01:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:30.071705444 +0000 UTC m=+1283.421995922" watchObservedRunningTime="2025-12-06 01:23:30.076553219 +0000 UTC m=+1283.426843687" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.099761 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b305d8ce-5702-42c2-979a-15e0f7e46cae" (UID: "b305d8ce-5702-42c2-979a-15e0f7e46cae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.112335 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.112567 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n7fd\" (UniqueName: \"kubernetes.io/projected/b305d8ce-5702-42c2-979a-15e0f7e46cae-kube-api-access-9n7fd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.112705 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.134177 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b305d8ce-5702-42c2-979a-15e0f7e46cae" (UID: "b305d8ce-5702-42c2-979a-15e0f7e46cae"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.163708 4991 scope.go:117] "RemoveContainer" containerID="7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.170664 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-config" (OuterVolumeSpecName: "config") pod "b305d8ce-5702-42c2-979a-15e0f7e46cae" (UID: "b305d8ce-5702-42c2-979a-15e0f7e46cae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.194815 4991 scope.go:117] "RemoveContainer" containerID="83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3" Dec 06 01:23:30 crc kubenswrapper[4991]: E1206 01:23:30.195382 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3\": container with ID starting with 83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3 not found: ID does not exist" containerID="83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.195432 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3"} err="failed to get container status \"83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3\": rpc error: code = NotFound desc = could not find container \"83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3\": container with ID starting with 83cd19e892c5b1c1a9d83ebc85f9d716d6a9f10363145fa246234c4943a9eae3 not found: ID does not exist" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.195460 4991 scope.go:117] "RemoveContainer" containerID="7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf" Dec 06 01:23:30 crc kubenswrapper[4991]: E1206 01:23:30.196052 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf\": container with ID starting with 7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf not found: ID does not exist" containerID="7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.196100 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf"} err="failed to get container status \"7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf\": rpc error: code = NotFound desc = could not find container \"7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf\": container with ID starting with 7c79c7eb4c22d5c28f1d73c3f542bad213a7899a8004af58de5dd084179087bf not found: ID does not exist" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.214507 4991 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.214573 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b305d8ce-5702-42c2-979a-15e0f7e46cae-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.382525 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84bc9ddf6-jd669"] Dec 06 01:23:30 crc kubenswrapper[4991]: I1206 01:23:30.391042 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84bc9ddf6-jd669"] Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.048269 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" path="/var/lib/kubelet/pods/b305d8ce-5702-42c2-979a-15e0f7e46cae/volumes" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.464107 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-c65b84cc8-zjpfz" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:37922->10.217.0.160:9311: read: connection reset 
by peer" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.464139 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-c65b84cc8-zjpfz" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:37936->10.217.0.160:9311: read: connection reset by peer" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.907098 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.947323 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-combined-ca-bundle\") pod \"34a2a9f3-1540-40b7-b00f-e67b824dc610\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.947464 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data\") pod \"34a2a9f3-1540-40b7-b00f-e67b824dc610\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.947510 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a2a9f3-1540-40b7-b00f-e67b824dc610-logs\") pod \"34a2a9f3-1540-40b7-b00f-e67b824dc610\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.947578 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data-custom\") pod \"34a2a9f3-1540-40b7-b00f-e67b824dc610\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " Dec 
06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.948069 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a2a9f3-1540-40b7-b00f-e67b824dc610-logs" (OuterVolumeSpecName: "logs") pod "34a2a9f3-1540-40b7-b00f-e67b824dc610" (UID: "34a2a9f3-1540-40b7-b00f-e67b824dc610"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.948144 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntlw5\" (UniqueName: \"kubernetes.io/projected/34a2a9f3-1540-40b7-b00f-e67b824dc610-kube-api-access-ntlw5\") pod \"34a2a9f3-1540-40b7-b00f-e67b824dc610\" (UID: \"34a2a9f3-1540-40b7-b00f-e67b824dc610\") " Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.948932 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a2a9f3-1540-40b7-b00f-e67b824dc610-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.955310 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a2a9f3-1540-40b7-b00f-e67b824dc610-kube-api-access-ntlw5" (OuterVolumeSpecName: "kube-api-access-ntlw5") pod "34a2a9f3-1540-40b7-b00f-e67b824dc610" (UID: "34a2a9f3-1540-40b7-b00f-e67b824dc610"). InnerVolumeSpecName "kube-api-access-ntlw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:31 crc kubenswrapper[4991]: I1206 01:23:31.956290 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "34a2a9f3-1540-40b7-b00f-e67b824dc610" (UID: "34a2a9f3-1540-40b7-b00f-e67b824dc610"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.011151 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34a2a9f3-1540-40b7-b00f-e67b824dc610" (UID: "34a2a9f3-1540-40b7-b00f-e67b824dc610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.051067 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.051106 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.051118 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntlw5\" (UniqueName: \"kubernetes.io/projected/34a2a9f3-1540-40b7-b00f-e67b824dc610-kube-api-access-ntlw5\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.071848 4991 generic.go:334] "Generic (PLEG): container finished" podID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerID="4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4" exitCode=0 Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.071927 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c65b84cc8-zjpfz" event={"ID":"34a2a9f3-1540-40b7-b00f-e67b824dc610","Type":"ContainerDied","Data":"4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4"} Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.071964 4991 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-api-c65b84cc8-zjpfz" event={"ID":"34a2a9f3-1540-40b7-b00f-e67b824dc610","Type":"ContainerDied","Data":"d97ccbe872ce77a3dc19bfbab9112a49f44db1f437ae37fa869194f5dced3ea7"} Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.072003 4991 scope.go:117] "RemoveContainer" containerID="4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.072129 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c65b84cc8-zjpfz" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.085497 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data" (OuterVolumeSpecName: "config-data") pod "34a2a9f3-1540-40b7-b00f-e67b824dc610" (UID: "34a2a9f3-1540-40b7-b00f-e67b824dc610"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.100960 4991 scope.go:117] "RemoveContainer" containerID="75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.124107 4991 scope.go:117] "RemoveContainer" containerID="4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4" Dec 06 01:23:32 crc kubenswrapper[4991]: E1206 01:23:32.125588 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4\": container with ID starting with 4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4 not found: ID does not exist" containerID="4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.127357 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4"} err="failed to get container status \"4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4\": rpc error: code = NotFound desc = could not find container \"4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4\": container with ID starting with 4426b63ca6f1d78d409353bb8bfd3eba7d96e3589bff72e7b46bb1841863fea4 not found: ID does not exist" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.127387 4991 scope.go:117] "RemoveContainer" containerID="75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260" Dec 06 01:23:32 crc kubenswrapper[4991]: E1206 01:23:32.128757 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260\": container with ID starting with 75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260 not found: ID does not exist" containerID="75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.128843 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260"} err="failed to get container status \"75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260\": rpc error: code = NotFound desc = could not find container \"75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260\": container with ID starting with 75509aa880d1a5266b56283a20be7c82285096b599c167c9408c9be245299260 not found: ID does not exist" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.152477 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a2a9f3-1540-40b7-b00f-e67b824dc610-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:32 crc kubenswrapper[4991]: 
I1206 01:23:32.402025 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.415412 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-c65b84cc8-zjpfz"] Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.423960 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-c65b84cc8-zjpfz"] Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.748843 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:32 crc kubenswrapper[4991]: I1206 01:23:32.818743 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d59dffb86-qrfm9" Dec 06 01:23:33 crc kubenswrapper[4991]: I1206 01:23:33.062706 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" path="/var/lib/kubelet/pods/34a2a9f3-1540-40b7-b00f-e67b824dc610/volumes" Dec 06 01:23:33 crc kubenswrapper[4991]: I1206 01:23:33.102362 4991 generic.go:334] "Generic (PLEG): container finished" podID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerID="696c3f31706c7bcf6be1fafe3f512c3ea184441fcb34083126b168a2bdddc61d" exitCode=0 Dec 06 01:23:33 crc kubenswrapper[4991]: I1206 01:23:33.105731 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b4cd4b7d8-99kjr" event={"ID":"ff6a4088-3509-40aa-a3f6-917e289cd44f","Type":"ContainerDied","Data":"696c3f31706c7bcf6be1fafe3f512c3ea184441fcb34083126b168a2bdddc61d"} Dec 06 01:23:33 crc kubenswrapper[4991]: I1206 01:23:33.216984 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-99654f6-8nq4x" Dec 06 01:23:34 crc kubenswrapper[4991]: I1206 01:23:34.486710 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b4cd4b7d8-99kjr" 
podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.623547 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.840205 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 06 01:23:37 crc kubenswrapper[4991]: E1206 01:23:37.841466 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.841500 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api" Dec 06 01:23:37 crc kubenswrapper[4991]: E1206 01:23:37.841549 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api-log" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.841558 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api-log" Dec 06 01:23:37 crc kubenswrapper[4991]: E1206 01:23:37.841607 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-api" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.841618 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-api" Dec 06 01:23:37 crc kubenswrapper[4991]: E1206 01:23:37.841644 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-httpd" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.841655 
4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-httpd" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.842165 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.842204 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-httpd" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.842214 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b305d8ce-5702-42c2-979a-15e0f7e46cae" containerName="neutron-api" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.842255 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a2a9f3-1540-40b7-b00f-e67b824dc610" containerName="barbican-api-log" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.843696 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.846127 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4lbjv" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.847041 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.847238 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.863632 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.906035 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff108074-7904-43a0-b621-6676980779f6-openstack-config\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.906354 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff108074-7904-43a0-b621-6676980779f6-openstack-config-secret\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.906418 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff108074-7904-43a0-b621-6676980779f6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.906470 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4tpf\" (UniqueName: \"kubernetes.io/projected/ff108074-7904-43a0-b621-6676980779f6-kube-api-access-b4tpf\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.971747 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-ff7f9b9c9-g4mpb"] Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.974219 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.976899 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.977135 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.977586 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 06 01:23:37 crc kubenswrapper[4991]: I1206 01:23:37.995104 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-ff7f9b9c9-g4mpb"] Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.008376 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4tpf\" (UniqueName: \"kubernetes.io/projected/ff108074-7904-43a0-b621-6676980779f6-kube-api-access-b4tpf\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.008647 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff108074-7904-43a0-b621-6676980779f6-openstack-config\") pod \"openstackclient\" (UID: 
\"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.008753 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff108074-7904-43a0-b621-6676980779f6-openstack-config-secret\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.009541 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff108074-7904-43a0-b621-6676980779f6-openstack-config\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.010253 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff108074-7904-43a0-b621-6676980779f6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.019245 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff108074-7904-43a0-b621-6676980779f6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.030253 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff108074-7904-43a0-b621-6676980779f6-openstack-config-secret\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.034073 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4tpf\" (UniqueName: \"kubernetes.io/projected/ff108074-7904-43a0-b621-6676980779f6-kube-api-access-b4tpf\") pod \"openstackclient\" (UID: \"ff108074-7904-43a0-b621-6676980779f6\") " pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112221 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bbcad69c-20b6-46da-95b6-28a7d993ebab-etc-swift\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112282 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbcad69c-20b6-46da-95b6-28a7d993ebab-run-httpd\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112308 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-internal-tls-certs\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112373 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-public-tls-certs\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112395 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-combined-ca-bundle\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112412 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbcad69c-20b6-46da-95b6-28a7d993ebab-log-httpd\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112433 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp6l4\" (UniqueName: \"kubernetes.io/projected/bbcad69c-20b6-46da-95b6-28a7d993ebab-kube-api-access-xp6l4\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.112457 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-config-data\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214091 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bbcad69c-20b6-46da-95b6-28a7d993ebab-etc-swift\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214149 
4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbcad69c-20b6-46da-95b6-28a7d993ebab-run-httpd\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214169 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-internal-tls-certs\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214223 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-public-tls-certs\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214248 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-combined-ca-bundle\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214269 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbcad69c-20b6-46da-95b6-28a7d993ebab-log-httpd\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214292 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xp6l4\" (UniqueName: \"kubernetes.io/projected/bbcad69c-20b6-46da-95b6-28a7d993ebab-kube-api-access-xp6l4\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214323 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-config-data\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214737 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.214767 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbcad69c-20b6-46da-95b6-28a7d993ebab-run-httpd\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.215132 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbcad69c-20b6-46da-95b6-28a7d993ebab-log-httpd\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.218237 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bbcad69c-20b6-46da-95b6-28a7d993ebab-etc-swift\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.218468 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-config-data\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.220210 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-internal-tls-certs\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.221716 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-public-tls-certs\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.242520 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcad69c-20b6-46da-95b6-28a7d993ebab-combined-ca-bundle\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.271303 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp6l4\" (UniqueName: \"kubernetes.io/projected/bbcad69c-20b6-46da-95b6-28a7d993ebab-kube-api-access-xp6l4\") pod \"swift-proxy-ff7f9b9c9-g4mpb\" (UID: \"bbcad69c-20b6-46da-95b6-28a7d993ebab\") " pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.299669 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.676896 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.677681 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-central-agent" containerID="cri-o://d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b" gracePeriod=30 Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.677954 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-notification-agent" containerID="cri-o://2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a" gracePeriod=30 Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.677964 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="sg-core" containerID="cri-o://f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c" gracePeriod=30 Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.678140 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="proxy-httpd" containerID="cri-o://6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14" gracePeriod=30 Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.689155 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": EOF" Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.728566 4991 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 06 01:23:38 crc kubenswrapper[4991]: I1206 01:23:38.923509 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-ff7f9b9c9-g4mpb"] Dec 06 01:23:38 crc kubenswrapper[4991]: W1206 01:23:38.923810 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbcad69c_20b6_46da_95b6_28a7d993ebab.slice/crio-95eb80d7aeb2e9559c42e52b5345022836515779dc5b33d2bba4588d305cc5d2 WatchSource:0}: Error finding container 95eb80d7aeb2e9559c42e52b5345022836515779dc5b33d2bba4588d305cc5d2: Status 404 returned error can't find the container with id 95eb80d7aeb2e9559c42e52b5345022836515779dc5b33d2bba4588d305cc5d2 Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.163728 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ff108074-7904-43a0-b621-6676980779f6","Type":"ContainerStarted","Data":"d412698c3af988bb7f269689778e1e92ddf0a87dc212f06cb7567e2874010037"} Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.170709 4991 generic.go:334] "Generic (PLEG): container finished" podID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerID="6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14" exitCode=0 Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.170748 4991 generic.go:334] "Generic (PLEG): container finished" podID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerID="f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c" exitCode=2 Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.170757 4991 generic.go:334] "Generic (PLEG): container finished" podID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerID="d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b" exitCode=0 Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.170782 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerDied","Data":"6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14"} Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.170833 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerDied","Data":"f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c"} Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.170848 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerDied","Data":"d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b"} Dec 06 01:23:39 crc kubenswrapper[4991]: I1206 01:23:39.172210 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" event={"ID":"bbcad69c-20b6-46da-95b6-28a7d993ebab","Type":"ContainerStarted","Data":"95eb80d7aeb2e9559c42e52b5345022836515779dc5b33d2bba4588d305cc5d2"} Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.035347 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.149790 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-log-httpd\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.149873 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-run-httpd\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.149893 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-config-data\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.150279 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ssn8\" (UniqueName: \"kubernetes.io/projected/eedff750-2cc3-4eff-a889-79e542f0ba39-kube-api-access-2ssn8\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.150368 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-sg-core-conf-yaml\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.150467 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.150584 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-combined-ca-bundle\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.150638 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-scripts\") pod \"eedff750-2cc3-4eff-a889-79e542f0ba39\" (UID: \"eedff750-2cc3-4eff-a889-79e542f0ba39\") " Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.152289 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.157160 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.162439 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedff750-2cc3-4eff-a889-79e542f0ba39-kube-api-access-2ssn8" (OuterVolumeSpecName: "kube-api-access-2ssn8") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "kube-api-access-2ssn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.170073 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-scripts" (OuterVolumeSpecName: "scripts") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.211086 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.213396 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" event={"ID":"bbcad69c-20b6-46da-95b6-28a7d993ebab","Type":"ContainerStarted","Data":"813e9ba5c5edc51f9a0ac71a6571374bdfa52017340d2ddebb3801f054e91751"} Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.213434 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" event={"ID":"bbcad69c-20b6-46da-95b6-28a7d993ebab","Type":"ContainerStarted","Data":"ea9ffaa34608921351c5c22b94ca884e249a1d1d0f87e0f71c9619c863a1750f"} Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.214111 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.214161 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.233336 4991 generic.go:334] "Generic (PLEG): container finished" podID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerID="2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a" exitCode=0 Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.233400 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerDied","Data":"2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a"} Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.233441 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eedff750-2cc3-4eff-a889-79e542f0ba39","Type":"ContainerDied","Data":"454db986e019c82e2107cf7807112494fc6b1e8137dfc0c1626df344a412dbb7"} Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.233465 4991 scope.go:117] "RemoveContainer" 
containerID="6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.233608 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.240099 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" podStartSLOduration=3.24007572 podStartE2EDuration="3.24007572s" podCreationTimestamp="2025-12-06 01:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:40.23350804 +0000 UTC m=+1293.583798508" watchObservedRunningTime="2025-12-06 01:23:40.24007572 +0000 UTC m=+1293.590366188" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.259238 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.259292 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eedff750-2cc3-4eff-a889-79e542f0ba39-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.259303 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ssn8\" (UniqueName: \"kubernetes.io/projected/eedff750-2cc3-4eff-a889-79e542f0ba39-kube-api-access-2ssn8\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.259316 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.306113 4991 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.307371 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-config-data" (OuterVolumeSpecName: "config-data") pod "eedff750-2cc3-4eff-a889-79e542f0ba39" (UID: "eedff750-2cc3-4eff-a889-79e542f0ba39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.361243 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.361279 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedff750-2cc3-4eff-a889-79e542f0ba39-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.366815 4991 scope.go:117] "RemoveContainer" containerID="f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.393591 4991 scope.go:117] "RemoveContainer" containerID="2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.423186 4991 scope.go:117] "RemoveContainer" containerID="d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.449511 4991 scope.go:117] "RemoveContainer" 
containerID="6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.450170 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14\": container with ID starting with 6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14 not found: ID does not exist" containerID="6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.450202 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14"} err="failed to get container status \"6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14\": rpc error: code = NotFound desc = could not find container \"6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14\": container with ID starting with 6392ce5712490db9fbd3ce2c4ab4e277b59338eb3c0203a0a28c0ce93487bc14 not found: ID does not exist" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.450223 4991 scope.go:117] "RemoveContainer" containerID="f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.450730 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c\": container with ID starting with f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c not found: ID does not exist" containerID="f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.450749 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c"} err="failed to get container status \"f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c\": rpc error: code = NotFound desc = could not find container \"f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c\": container with ID starting with f363ce7370925afdc02e74af77eba91e67b1c55e34abd285b829bf240957e14c not found: ID does not exist" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.450764 4991 scope.go:117] "RemoveContainer" containerID="2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.451221 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a\": container with ID starting with 2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a not found: ID does not exist" containerID="2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.451245 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a"} err="failed to get container status \"2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a\": rpc error: code = NotFound desc = could not find container \"2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a\": container with ID starting with 2de9cb7275d9cb6984d1fd371537bb0907ed76276a4fb19a12226daea11c702a not found: ID does not exist" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.451258 4991 scope.go:117] "RemoveContainer" containerID="d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.451552 4991 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b\": container with ID starting with d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b not found: ID does not exist" containerID="d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.451570 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b"} err="failed to get container status \"d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b\": rpc error: code = NotFound desc = could not find container \"d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b\": container with ID starting with d2fef9984ff4b0ab7f85374fa5466168c9dcb6faffa9ad40e39766f270ccda4b not found: ID does not exist" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.579923 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.591702 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605215 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.605571 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="sg-core" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605589 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="sg-core" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.605603 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-central-agent" Dec 06 01:23:40 crc 
kubenswrapper[4991]: I1206 01:23:40.605610 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-central-agent" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.605628 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-notification-agent" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605633 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-notification-agent" Dec 06 01:23:40 crc kubenswrapper[4991]: E1206 01:23:40.605645 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="proxy-httpd" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605651 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="proxy-httpd" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605824 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-central-agent" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605841 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="proxy-httpd" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605855 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="sg-core" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.605866 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" containerName="ceilometer-notification-agent" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.607434 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.616225 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.618226 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.667046 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668650 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcksl\" (UniqueName: \"kubernetes.io/projected/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-kube-api-access-jcksl\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668784 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-config-data\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668823 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-run-httpd\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668859 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668902 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-scripts\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668930 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.668946 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-log-httpd\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.771947 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcksl\" (UniqueName: \"kubernetes.io/projected/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-kube-api-access-jcksl\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.772085 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-config-data\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.772123 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-run-httpd\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.772160 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.772199 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-scripts\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.772230 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.772246 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-log-httpd\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.774010 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-log-httpd\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " 
pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.778251 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-config-data\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.779233 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.780255 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-run-httpd\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.780686 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.780831 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-scripts\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.792073 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcksl\" (UniqueName: 
\"kubernetes.io/projected/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-kube-api-access-jcksl\") pod \"ceilometer-0\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " pod="openstack/ceilometer-0" Dec 06 01:23:40 crc kubenswrapper[4991]: I1206 01:23:40.931397 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:41 crc kubenswrapper[4991]: I1206 01:23:41.051216 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eedff750-2cc3-4eff-a889-79e542f0ba39" path="/var/lib/kubelet/pods/eedff750-2cc3-4eff-a889-79e542f0ba39/volumes" Dec 06 01:23:41 crc kubenswrapper[4991]: I1206 01:23:41.417213 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:42 crc kubenswrapper[4991]: I1206 01:23:42.258592 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerStarted","Data":"3ece86d357539c41d0d1f8c96488a1beb479768fa40ba1531b3ab1a5fcb28037"} Dec 06 01:23:43 crc kubenswrapper[4991]: I1206 01:23:43.279292 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerStarted","Data":"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75"} Dec 06 01:23:43 crc kubenswrapper[4991]: I1206 01:23:43.280916 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerStarted","Data":"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176"} Dec 06 01:23:44 crc kubenswrapper[4991]: I1206 01:23:44.300347 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerStarted","Data":"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35"} Dec 06 01:23:44 crc kubenswrapper[4991]: I1206 
01:23:44.486943 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b4cd4b7d8-99kjr" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 06 01:23:45 crc kubenswrapper[4991]: I1206 01:23:45.312775 4991 generic.go:334] "Generic (PLEG): container finished" podID="5bf196fe-b548-409b-9297-8e181d01ef54" containerID="fc80a01f2f8f2b9705e46f8aa3b37adce4e66449214245dc0943f89b38c4e570" exitCode=137 Dec 06 01:23:45 crc kubenswrapper[4991]: I1206 01:23:45.312816 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5bf196fe-b548-409b-9297-8e181d01ef54","Type":"ContainerDied","Data":"fc80a01f2f8f2b9705e46f8aa3b37adce4e66449214245dc0943f89b38c4e570"} Dec 06 01:23:45 crc kubenswrapper[4991]: I1206 01:23:45.476956 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.162:8776/healthcheck\": dial tcp 10.217.0.162:8776: connect: connection refused" Dec 06 01:23:48 crc kubenswrapper[4991]: I1206 01:23:48.224870 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:48 crc kubenswrapper[4991]: I1206 01:23:48.307260 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:48 crc kubenswrapper[4991]: I1206 01:23:48.310103 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-ff7f9b9c9-g4mpb" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.278127 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311008 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311269 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf196fe-b548-409b-9297-8e181d01ef54-logs\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311383 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data-custom\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311490 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-scripts\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311573 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf196fe-b548-409b-9297-8e181d01ef54-etc-machine-id\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311679 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4784\" (UniqueName: 
\"kubernetes.io/projected/5bf196fe-b548-409b-9297-8e181d01ef54-kube-api-access-q4784\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.311804 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-combined-ca-bundle\") pod \"5bf196fe-b548-409b-9297-8e181d01ef54\" (UID: \"5bf196fe-b548-409b-9297-8e181d01ef54\") " Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.319783 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bf196fe-b548-409b-9297-8e181d01ef54-logs" (OuterVolumeSpecName: "logs") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.320052 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bf196fe-b548-409b-9297-8e181d01ef54-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.320252 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.325434 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-scripts" (OuterVolumeSpecName: "scripts") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.327832 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf196fe-b548-409b-9297-8e181d01ef54-kube-api-access-q4784" (OuterVolumeSpecName: "kube-api-access-q4784") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "kube-api-access-q4784". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.346664 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.373344 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data" (OuterVolumeSpecName: "config-data") pod "5bf196fe-b548-409b-9297-8e181d01ef54" (UID: "5bf196fe-b548-409b-9297-8e181d01ef54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.374550 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ff108074-7904-43a0-b621-6676980779f6","Type":"ContainerStarted","Data":"5d4dc9d3d5682c1bb899fa89c3c288e694a509ae493adabed4f136dc80afc0e4"} Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.378558 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5bf196fe-b548-409b-9297-8e181d01ef54","Type":"ContainerDied","Data":"c6774d23f01e847a64f1d8264057d11a5ef63a9dea0e0705b679f8eb3a60842c"} Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.378618 4991 scope.go:117] "RemoveContainer" containerID="fc80a01f2f8f2b9705e46f8aa3b37adce4e66449214245dc0943f89b38c4e570" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.378773 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414115 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414165 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bf196fe-b548-409b-9297-8e181d01ef54-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414174 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414185 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414196 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf196fe-b548-409b-9297-8e181d01ef54-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414205 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4784\" (UniqueName: \"kubernetes.io/projected/5bf196fe-b548-409b-9297-8e181d01ef54-kube-api-access-q4784\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.414215 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf196fe-b548-409b-9297-8e181d01ef54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.423242 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.625676831 podStartE2EDuration="13.423211038s" podCreationTimestamp="2025-12-06 01:23:37 +0000 UTC" firstStartedPulling="2025-12-06 01:23:38.732410632 +0000 UTC m=+1292.082701100" lastFinishedPulling="2025-12-06 01:23:49.529944839 +0000 UTC m=+1302.880235307" observedRunningTime="2025-12-06 01:23:50.393970552 +0000 UTC m=+1303.744261020" watchObservedRunningTime="2025-12-06 01:23:50.423211038 +0000 UTC m=+1303.773501496" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.446124 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.458843 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.484499 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:50 
crc kubenswrapper[4991]: E1206 01:23:50.485084 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api-log" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.485105 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api-log" Dec 06 01:23:50 crc kubenswrapper[4991]: E1206 01:23:50.485116 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.485124 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.485321 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api-log" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.485359 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" containerName="cinder-api" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.486451 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.491306 4991 scope.go:117] "RemoveContainer" containerID="9af773cf8d803bbb5355798f2f498e8064a8c08bae72ec3bca5d7e4ffc19fefc" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.491475 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.491739 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.491788 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.495199 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516445 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-scripts\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516632 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516675 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-config-data-custom\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " 
pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516701 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/612d471c-b8ae-46ac-b3ff-9920f5615bd4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516753 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfzt7\" (UniqueName: \"kubernetes.io/projected/612d471c-b8ae-46ac-b3ff-9920f5615bd4-kube-api-access-rfzt7\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516888 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.516937 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.517025 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-config-data\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.517080 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/612d471c-b8ae-46ac-b3ff-9920f5615bd4-logs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619066 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619120 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-config-data-custom\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619139 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/612d471c-b8ae-46ac-b3ff-9920f5615bd4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619167 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfzt7\" (UniqueName: \"kubernetes.io/projected/612d471c-b8ae-46ac-b3ff-9920f5615bd4-kube-api-access-rfzt7\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619188 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-public-tls-certs\") 
pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619211 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619246 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-config-data\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619274 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/612d471c-b8ae-46ac-b3ff-9920f5615bd4-logs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.619314 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-scripts\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.620532 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/612d471c-b8ae-46ac-b3ff-9920f5615bd4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.621264 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/612d471c-b8ae-46ac-b3ff-9920f5615bd4-logs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.624699 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-config-data-custom\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.628791 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-config-data\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.629554 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.637196 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-scripts\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.637574 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.638427 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612d471c-b8ae-46ac-b3ff-9920f5615bd4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.643973 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfzt7\" (UniqueName: \"kubernetes.io/projected/612d471c-b8ae-46ac-b3ff-9920f5615bd4-kube-api-access-rfzt7\") pod \"cinder-api-0\" (UID: \"612d471c-b8ae-46ac-b3ff-9920f5615bd4\") " pod="openstack/cinder-api-0" Dec 06 01:23:50 crc kubenswrapper[4991]: I1206 01:23:50.825066 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.058907 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf196fe-b548-409b-9297-8e181d01ef54" path="/var/lib/kubelet/pods/5bf196fe-b548-409b-9297-8e181d01ef54/volumes" Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.315389 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 06 01:23:51 crc kubenswrapper[4991]: W1206 01:23:51.323399 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612d471c_b8ae_46ac_b3ff_9920f5615bd4.slice/crio-493fdf3d593691e777dd84a8e8b4aed94d1a076575a722fffb43dcd293131b68 WatchSource:0}: Error finding container 493fdf3d593691e777dd84a8e8b4aed94d1a076575a722fffb43dcd293131b68: Status 404 returned error can't find the container with id 493fdf3d593691e777dd84a8e8b4aed94d1a076575a722fffb43dcd293131b68 Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.402503 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"612d471c-b8ae-46ac-b3ff-9920f5615bd4","Type":"ContainerStarted","Data":"493fdf3d593691e777dd84a8e8b4aed94d1a076575a722fffb43dcd293131b68"} Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.405699 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerStarted","Data":"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5"} Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.405797 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-central-agent" containerID="cri-o://705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" gracePeriod=30 Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.405825 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.405819 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="proxy-httpd" containerID="cri-o://5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" gracePeriod=30 Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.405867 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="sg-core" containerID="cri-o://055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" gracePeriod=30 Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.405916 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-notification-agent" containerID="cri-o://9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" 
gracePeriod=30 Dec 06 01:23:51 crc kubenswrapper[4991]: I1206 01:23:51.428641 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.067979898 podStartE2EDuration="11.428615397s" podCreationTimestamp="2025-12-06 01:23:40 +0000 UTC" firstStartedPulling="2025-12-06 01:23:41.427464305 +0000 UTC m=+1294.777754763" lastFinishedPulling="2025-12-06 01:23:50.788099794 +0000 UTC m=+1304.138390262" observedRunningTime="2025-12-06 01:23:51.423907896 +0000 UTC m=+1304.774198364" watchObservedRunningTime="2025-12-06 01:23:51.428615397 +0000 UTC m=+1304.778905865" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.244246 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353257 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-run-httpd\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353312 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-config-data\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353401 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcksl\" (UniqueName: \"kubernetes.io/projected/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-kube-api-access-jcksl\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353450 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-sg-core-conf-yaml\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353543 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-log-httpd\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353580 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-combined-ca-bundle\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.353626 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-scripts\") pod \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\" (UID: \"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df\") " Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.354726 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.354868 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.358013 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-kube-api-access-jcksl" (OuterVolumeSpecName: "kube-api-access-jcksl") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). InnerVolumeSpecName "kube-api-access-jcksl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.358031 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-scripts" (OuterVolumeSpecName: "scripts") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.386677 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.434513 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"612d471c-b8ae-46ac-b3ff-9920f5615bd4","Type":"ContainerStarted","Data":"185219f6368a3681de361a4e087597572c20983ce2dafa70801f9b471c80bbbe"} Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.443128 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448511 4991 generic.go:334] "Generic (PLEG): container finished" podID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" exitCode=0 Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448545 4991 generic.go:334] "Generic (PLEG): container finished" podID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" exitCode=2 Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448554 4991 generic.go:334] "Generic (PLEG): container finished" podID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" exitCode=0 Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448561 4991 generic.go:334] "Generic (PLEG): container finished" podID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" exitCode=0 Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448582 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerDied","Data":"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5"} Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448608 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerDied","Data":"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35"} Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448618 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerDied","Data":"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75"} Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448626 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerDied","Data":"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176"} Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448637 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df","Type":"ContainerDied","Data":"3ece86d357539c41d0d1f8c96488a1beb479768fa40ba1531b3ab1a5fcb28037"} Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448652 4991 scope.go:117] "RemoveContainer" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.448760 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.457750 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.457780 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.457789 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.457798 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.457807 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.457814 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcksl\" (UniqueName: \"kubernetes.io/projected/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-kube-api-access-jcksl\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.480394 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-config-data" (OuterVolumeSpecName: "config-data") pod "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" (UID: "ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.559281 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.609298 4991 scope.go:117] "RemoveContainer" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.652304 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2l6rx"] Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.652888 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-central-agent" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.652902 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-central-agent" Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.652926 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="proxy-httpd" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.652934 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="proxy-httpd" Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.652963 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-notification-agent" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.652970 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-notification-agent" Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.652999 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="sg-core" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.653008 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="sg-core" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.653210 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-central-agent" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.653224 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="sg-core" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.653251 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="ceilometer-notification-agent" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.653268 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" containerName="proxy-httpd" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.654055 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.671443 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2l6rx"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.675159 4991 scope.go:117] "RemoveContainer" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.753073 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qt8jb"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.754801 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.765431 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5wn\" (UniqueName: \"kubernetes.io/projected/cad619ec-5672-46aa-b7d8-2f996561383d-kube-api-access-sz5wn\") pod \"nova-api-db-create-2l6rx\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.765697 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cad619ec-5672-46aa-b7d8-2f996561383d-operator-scripts\") pod \"nova-api-db-create-2l6rx\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.781142 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qt8jb"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.793734 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-adf2-account-create-update-57sjq"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.795324 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.804799 4991 scope.go:117] "RemoveContainer" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.805237 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.819244 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-adf2-account-create-update-57sjq"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.842899 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.850905 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.855031 4991 scope.go:117] "RemoveContainer" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.857678 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": container with ID starting with 5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5 not found: ID does not exist" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.857750 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5"} err="failed to get container status \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": rpc error: code = NotFound desc = could not find container 
\"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": container with ID starting with 5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.857821 4991 scope.go:117] "RemoveContainer" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.858389 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.861060 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.865642 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.865920 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.868146 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cad619ec-5672-46aa-b7d8-2f996561383d-operator-scripts\") pod \"nova-api-db-create-2l6rx\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.868294 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf1edef-446b-4eb8-beba-fde054153a2b-operator-scripts\") pod \"nova-cell0-db-create-qt8jb\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.868330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5wn\" (UniqueName: 
\"kubernetes.io/projected/cad619ec-5672-46aa-b7d8-2f996561383d-kube-api-access-sz5wn\") pod \"nova-api-db-create-2l6rx\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.868376 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5j9\" (UniqueName: \"kubernetes.io/projected/dbf1edef-446b-4eb8-beba-fde054153a2b-kube-api-access-jj5j9\") pod \"nova-cell0-db-create-qt8jb\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.868608 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": container with ID starting with 055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35 not found: ID does not exist" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.868656 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35"} err="failed to get container status \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": rpc error: code = NotFound desc = could not find container \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": container with ID starting with 055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.868690 4991 scope.go:117] "RemoveContainer" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.870145 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cad619ec-5672-46aa-b7d8-2f996561383d-operator-scripts\") pod \"nova-api-db-create-2l6rx\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.874894 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.878291 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": container with ID starting with 9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75 not found: ID does not exist" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.878337 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75"} err="failed to get container status \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": rpc error: code = NotFound desc = could not find container \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": container with ID starting with 9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.878378 4991 scope.go:117] "RemoveContainer" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" Dec 06 01:23:52 crc kubenswrapper[4991]: E1206 01:23:52.879325 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": container with ID starting with 705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176 not found: 
ID does not exist" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.879372 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176"} err="failed to get container status \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": rpc error: code = NotFound desc = could not find container \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": container with ID starting with 705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.879406 4991 scope.go:117] "RemoveContainer" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.890313 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5"} err="failed to get container status \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": rpc error: code = NotFound desc = could not find container \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": container with ID starting with 5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.890408 4991 scope.go:117] "RemoveContainer" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.898387 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35"} err="failed to get container status \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": rpc error: code = 
NotFound desc = could not find container \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": container with ID starting with 055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.898447 4991 scope.go:117] "RemoveContainer" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.901122 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5wn\" (UniqueName: \"kubernetes.io/projected/cad619ec-5672-46aa-b7d8-2f996561383d-kube-api-access-sz5wn\") pod \"nova-api-db-create-2l6rx\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.971941 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s58b\" (UniqueName: \"kubernetes.io/projected/4899a057-9b8a-4dd0-8219-9a626d671fdf-kube-api-access-8s58b\") pod \"nova-api-adf2-account-create-update-57sjq\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.972293 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-config-data\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.972469 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4899a057-9b8a-4dd0-8219-9a626d671fdf-operator-scripts\") pod \"nova-api-adf2-account-create-update-57sjq\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " 
pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.972827 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf1edef-446b-4eb8-beba-fde054153a2b-operator-scripts\") pod \"nova-cell0-db-create-qt8jb\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.973503 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5j9\" (UniqueName: \"kubernetes.io/projected/dbf1edef-446b-4eb8-beba-fde054153a2b-kube-api-access-jj5j9\") pod \"nova-cell0-db-create-qt8jb\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.972738 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hctc4"] Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.973864 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.974038 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-scripts\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.974111 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.974145 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6j42\" (UniqueName: \"kubernetes.io/projected/4b475d4e-b396-4747-9aba-b6ff6bed11a1-kube-api-access-l6j42\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.974181 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.974254 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.976597 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf1edef-446b-4eb8-beba-fde054153a2b-operator-scripts\") pod \"nova-cell0-db-create-qt8jb\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.976646 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.977461 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.978844 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75"} err="failed to get container status \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": rpc error: code = NotFound desc = could not find container \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": container with ID starting with 9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.978883 4991 scope.go:117] "RemoveContainer" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.980967 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176"} err="failed to get container status \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": rpc error: code = NotFound desc = could not find container \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": container with ID starting with 705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.981156 4991 scope.go:117] "RemoveContainer" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.981690 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5"} err="failed to get container status \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": rpc error: code = NotFound desc = could not find 
container \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": container with ID starting with 5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.981747 4991 scope.go:117] "RemoveContainer" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.982194 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35"} err="failed to get container status \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": rpc error: code = NotFound desc = could not find container \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": container with ID starting with 055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.982236 4991 scope.go:117] "RemoveContainer" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.982491 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75"} err="failed to get container status \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": rpc error: code = NotFound desc = could not find container \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": container with ID starting with 9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.982519 4991 scope.go:117] "RemoveContainer" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.982943 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176"} err="failed to get container status \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": rpc error: code = NotFound desc = could not find container \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": container with ID starting with 705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.983062 4991 scope.go:117] "RemoveContainer" containerID="5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.984226 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5"} err="failed to get container status \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": rpc error: code = NotFound desc = could not find container \"5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5\": container with ID starting with 5752c4000c15c7891d0b11112f03c72083d64c6e0d35bf63d3bd13a5af540ee5 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.984281 4991 scope.go:117] "RemoveContainer" containerID="055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.984727 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35"} err="failed to get container status \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": rpc error: code = NotFound desc = could not find container \"055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35\": container with ID starting with 
055aceabd52c109b016a6fb686943afc74a51ba7d72452a9ee7db76bf33cca35 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.984815 4991 scope.go:117] "RemoveContainer" containerID="9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.985110 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75"} err="failed to get container status \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": rpc error: code = NotFound desc = could not find container \"9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75\": container with ID starting with 9f88b126dec0d1d3473971ee13033e799488e5c3d3b54e9df2099fc7d6800b75 not found: ID does not exist" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.985194 4991 scope.go:117] "RemoveContainer" containerID="705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176" Dec 06 01:23:52 crc kubenswrapper[4991]: I1206 01:23:52.986178 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176"} err="failed to get container status \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": rpc error: code = NotFound desc = could not find container \"705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176\": container with ID starting with 705726e8fb183ed551b7054800f94200b20bb1ad78a6726d735b4d61f51ae176 not found: ID does not exist" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:52.999188 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5j9\" (UniqueName: \"kubernetes.io/projected/dbf1edef-446b-4eb8-beba-fde054153a2b-kube-api-access-jj5j9\") pod \"nova-cell0-db-create-qt8jb\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " 
pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.006089 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hctc4"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.053583 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df" path="/var/lib/kubelet/pods/ba51c3aa-e6ee-4341-b0f0-6fa5abbf75df/volumes" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.060158 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-92a2-account-create-update-8xs8q"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.062130 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.065169 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.073219 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-92a2-account-create-update-8xs8q"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.079994 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.084655 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwzt\" (UniqueName: \"kubernetes.io/projected/6f786a82-5143-4f90-aed1-f9d4ffa49f00-kube-api-access-rdwzt\") pod \"nova-cell1-db-create-hctc4\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089246 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s58b\" (UniqueName: \"kubernetes.io/projected/4899a057-9b8a-4dd0-8219-9a626d671fdf-kube-api-access-8s58b\") pod \"nova-api-adf2-account-create-update-57sjq\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089297 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-config-data\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089438 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4899a057-9b8a-4dd0-8219-9a626d671fdf-operator-scripts\") pod \"nova-api-adf2-account-create-update-57sjq\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089631 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089727 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-scripts\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089768 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f786a82-5143-4f90-aed1-f9d4ffa49f00-operator-scripts\") pod \"nova-cell1-db-create-hctc4\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089834 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089861 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6j42\" (UniqueName: \"kubernetes.io/projected/4b475d4e-b396-4747-9aba-b6ff6bed11a1-kube-api-access-l6j42\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089893 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.089968 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.090456 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.093230 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4899a057-9b8a-4dd0-8219-9a626d671fdf-operator-scripts\") pod \"nova-api-adf2-account-create-update-57sjq\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.094419 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.100532 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-config-data\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.102014 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.108071 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.108341 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-scripts\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.110326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s58b\" (UniqueName: \"kubernetes.io/projected/4899a057-9b8a-4dd0-8219-9a626d671fdf-kube-api-access-8s58b\") pod \"nova-api-adf2-account-create-update-57sjq\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.114415 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6j42\" (UniqueName: \"kubernetes.io/projected/4b475d4e-b396-4747-9aba-b6ff6bed11a1-kube-api-access-l6j42\") pod \"ceilometer-0\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.131461 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f23d-account-create-update-lk7cd"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.132693 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.135289 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.138876 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.144509 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f23d-account-create-update-lk7cd"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.193671 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f786a82-5143-4f90-aed1-f9d4ffa49f00-operator-scripts\") pod \"nova-cell1-db-create-hctc4\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.194451 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pcc\" (UniqueName: \"kubernetes.io/projected/22ac1a63-4582-436a-a235-bb4fe90fe6f3-kube-api-access-22pcc\") pod \"nova-cell0-92a2-account-create-update-8xs8q\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.194593 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwzt\" (UniqueName: \"kubernetes.io/projected/6f786a82-5143-4f90-aed1-f9d4ffa49f00-kube-api-access-rdwzt\") pod \"nova-cell1-db-create-hctc4\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.194670 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f786a82-5143-4f90-aed1-f9d4ffa49f00-operator-scripts\") pod \"nova-cell1-db-create-hctc4\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.194709 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ac1a63-4582-436a-a235-bb4fe90fe6f3-operator-scripts\") pod \"nova-cell0-92a2-account-create-update-8xs8q\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.200223 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.216370 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwzt\" (UniqueName: \"kubernetes.io/projected/6f786a82-5143-4f90-aed1-f9d4ffa49f00-kube-api-access-rdwzt\") pod \"nova-cell1-db-create-hctc4\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.298371 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-operator-scripts\") pod \"nova-cell1-f23d-account-create-update-lk7cd\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.298432 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzc2\" (UniqueName: \"kubernetes.io/projected/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-kube-api-access-2kzc2\") pod 
\"nova-cell1-f23d-account-create-update-lk7cd\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.298502 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ac1a63-4582-436a-a235-bb4fe90fe6f3-operator-scripts\") pod \"nova-cell0-92a2-account-create-update-8xs8q\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.298630 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22pcc\" (UniqueName: \"kubernetes.io/projected/22ac1a63-4582-436a-a235-bb4fe90fe6f3-kube-api-access-22pcc\") pod \"nova-cell0-92a2-account-create-update-8xs8q\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.299714 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ac1a63-4582-436a-a235-bb4fe90fe6f3-operator-scripts\") pod \"nova-cell0-92a2-account-create-update-8xs8q\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.318040 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22pcc\" (UniqueName: \"kubernetes.io/projected/22ac1a63-4582-436a-a235-bb4fe90fe6f3-kube-api-access-22pcc\") pod \"nova-cell0-92a2-account-create-update-8xs8q\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.339210 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.400902 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-operator-scripts\") pod \"nova-cell1-f23d-account-create-update-lk7cd\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.400966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzc2\" (UniqueName: \"kubernetes.io/projected/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-kube-api-access-2kzc2\") pod \"nova-cell1-f23d-account-create-update-lk7cd\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.402619 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-operator-scripts\") pod \"nova-cell1-f23d-account-create-update-lk7cd\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.422092 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzc2\" (UniqueName: \"kubernetes.io/projected/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-kube-api-access-2kzc2\") pod \"nova-cell1-f23d-account-create-update-lk7cd\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.456706 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.481947 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.485783 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.494893 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"612d471c-b8ae-46ac-b3ff-9920f5615bd4","Type":"ContainerStarted","Data":"9919ef2b0d1f244ce212f2c5a6ca2ed2c0af5df8cc13bddc9c2cc4b906711e47"} Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.496340 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.537002 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.536951897 podStartE2EDuration="3.536951897s" podCreationTimestamp="2025-12-06 01:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:53.517154165 +0000 UTC m=+1306.867444623" watchObservedRunningTime="2025-12-06 01:23:53.536951897 +0000 UTC m=+1306.887242365" Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.623001 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2l6rx"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.729301 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qt8jb"] Dec 06 01:23:53 crc kubenswrapper[4991]: I1206 01:23:53.896676 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-adf2-account-create-update-57sjq"] Dec 06 01:23:53 crc kubenswrapper[4991]: W1206 
01:23:53.913882 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4899a057_9b8a_4dd0_8219_9a626d671fdf.slice/crio-57f88fb9a6ef4447a7bef1a1f03998daf8454adbfc8da11e0625632e726139fb WatchSource:0}: Error finding container 57f88fb9a6ef4447a7bef1a1f03998daf8454adbfc8da11e0625632e726139fb: Status 404 returned error can't find the container with id 57f88fb9a6ef4447a7bef1a1f03998daf8454adbfc8da11e0625632e726139fb Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.097709 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:23:54 crc kubenswrapper[4991]: W1206 01:23:54.101808 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b475d4e_b396_4747_9aba_b6ff6bed11a1.slice/crio-627114f2d025c2690b99271060a1df93ebb8c9b1a85f9f110a2ea045f5cdbefa WatchSource:0}: Error finding container 627114f2d025c2690b99271060a1df93ebb8c9b1a85f9f110a2ea045f5cdbefa: Status 404 returned error can't find the container with id 627114f2d025c2690b99271060a1df93ebb8c9b1a85f9f110a2ea045f5cdbefa Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.180395 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hctc4"] Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.199502 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-92a2-account-create-update-8xs8q"] Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.214928 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f23d-account-create-update-lk7cd"] Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.487116 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b4cd4b7d8-99kjr" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.487486 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.529939 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerStarted","Data":"627114f2d025c2690b99271060a1df93ebb8c9b1a85f9f110a2ea045f5cdbefa"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.533806 4991 generic.go:334] "Generic (PLEG): container finished" podID="4899a057-9b8a-4dd0-8219-9a626d671fdf" containerID="3b491f4c057603a34220deafe82de216d1ee9347fc6ae89a40de77462dabc0af" exitCode=0 Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.534227 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-adf2-account-create-update-57sjq" event={"ID":"4899a057-9b8a-4dd0-8219-9a626d671fdf","Type":"ContainerDied","Data":"3b491f4c057603a34220deafe82de216d1ee9347fc6ae89a40de77462dabc0af"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.534247 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-adf2-account-create-update-57sjq" event={"ID":"4899a057-9b8a-4dd0-8219-9a626d671fdf","Type":"ContainerStarted","Data":"57f88fb9a6ef4447a7bef1a1f03998daf8454adbfc8da11e0625632e726139fb"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.545170 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hctc4" event={"ID":"6f786a82-5143-4f90-aed1-f9d4ffa49f00","Type":"ContainerStarted","Data":"0235a66113624b53f29859d28fc5f6a0ffa576010f106a33ba72c4c1ea8aa8d8"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.545226 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hctc4" 
event={"ID":"6f786a82-5143-4f90-aed1-f9d4ffa49f00","Type":"ContainerStarted","Data":"b018eeb4158051ea23fa4e1cfb520f7cfdc9638e5bb644a5165f6fef6a71830e"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.547468 4991 generic.go:334] "Generic (PLEG): container finished" podID="cad619ec-5672-46aa-b7d8-2f996561383d" containerID="f4a6f9d709cfe08d2d52f893780718dbcf0330c9cf1fc125856dd67768020bed" exitCode=0 Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.547525 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2l6rx" event={"ID":"cad619ec-5672-46aa-b7d8-2f996561383d","Type":"ContainerDied","Data":"f4a6f9d709cfe08d2d52f893780718dbcf0330c9cf1fc125856dd67768020bed"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.547544 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2l6rx" event={"ID":"cad619ec-5672-46aa-b7d8-2f996561383d","Type":"ContainerStarted","Data":"78bf0922cb95bd67bcfc6a83b30a1532d8235b045c48c1edc968547f5269c86c"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.555178 4991 generic.go:334] "Generic (PLEG): container finished" podID="dbf1edef-446b-4eb8-beba-fde054153a2b" containerID="84f20f5129d9f77ae1b49058cf2080084cf5f23d46cb8e6fd7eeff9066a5d747" exitCode=0 Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.555221 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt8jb" event={"ID":"dbf1edef-446b-4eb8-beba-fde054153a2b","Type":"ContainerDied","Data":"84f20f5129d9f77ae1b49058cf2080084cf5f23d46cb8e6fd7eeff9066a5d747"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.555238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt8jb" event={"ID":"dbf1edef-446b-4eb8-beba-fde054153a2b","Type":"ContainerStarted","Data":"4e43bc060a3fda812ac15af7d60f95aaf526c8c537bcd92a7dc49b4b7db66cb4"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.560564 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" event={"ID":"22ac1a63-4582-436a-a235-bb4fe90fe6f3","Type":"ContainerStarted","Data":"cdae0988a135b2009094dfe2b50f0009f6a1e39a58e62e7df4e94c668352eba7"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.562851 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" event={"ID":"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83","Type":"ContainerStarted","Data":"942e7baa801e6c5fc2fb45207217f9f95c223d787e0b87f2d209dd870358a392"} Dec 06 01:23:54 crc kubenswrapper[4991]: I1206 01:23:54.680928 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-hctc4" podStartSLOduration=2.680897259 podStartE2EDuration="2.680897259s" podCreationTimestamp="2025-12-06 01:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:23:54.623018432 +0000 UTC m=+1307.973308910" watchObservedRunningTime="2025-12-06 01:23:54.680897259 +0000 UTC m=+1308.031187717" Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.573773 4991 generic.go:334] "Generic (PLEG): container finished" podID="22ac1a63-4582-436a-a235-bb4fe90fe6f3" containerID="8e09c4543f5a3160784e09fed0e4849d9c5931ae2962530def345e545c7e8e05" exitCode=0 Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.574266 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" event={"ID":"22ac1a63-4582-436a-a235-bb4fe90fe6f3","Type":"ContainerDied","Data":"8e09c4543f5a3160784e09fed0e4849d9c5931ae2962530def345e545c7e8e05"} Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.578094 4991 generic.go:334] "Generic (PLEG): container finished" podID="9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" containerID="cd23f41039c06876fc04bf7affc4879fd72f6c958db0e42354341dce0b857b95" exitCode=0 Dec 06 01:23:55 
crc kubenswrapper[4991]: I1206 01:23:55.578156 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" event={"ID":"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83","Type":"ContainerDied","Data":"cd23f41039c06876fc04bf7affc4879fd72f6c958db0e42354341dce0b857b95"} Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.584433 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerStarted","Data":"5b713ecd2f19e5cfef2d08df7794411b3b31acae252f465ac5ecfce528146af7"} Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.584481 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerStarted","Data":"e8d24f81c46907da68d292f3bf6c8d16230dc3ffe0ed92b1391a1069494e827f"} Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.591219 4991 generic.go:334] "Generic (PLEG): container finished" podID="6f786a82-5143-4f90-aed1-f9d4ffa49f00" containerID="0235a66113624b53f29859d28fc5f6a0ffa576010f106a33ba72c4c1ea8aa8d8" exitCode=0 Dec 06 01:23:55 crc kubenswrapper[4991]: I1206 01:23:55.591463 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hctc4" event={"ID":"6f786a82-5143-4f90-aed1-f9d4ffa49f00","Type":"ContainerDied","Data":"0235a66113624b53f29859d28fc5f6a0ffa576010f106a33ba72c4c1ea8aa8d8"} Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.044298 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.102337 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.106746 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.165297 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cad619ec-5672-46aa-b7d8-2f996561383d-operator-scripts\") pod \"cad619ec-5672-46aa-b7d8-2f996561383d\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.165381 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz5wn\" (UniqueName: \"kubernetes.io/projected/cad619ec-5672-46aa-b7d8-2f996561383d-kube-api-access-sz5wn\") pod \"cad619ec-5672-46aa-b7d8-2f996561383d\" (UID: \"cad619ec-5672-46aa-b7d8-2f996561383d\") " Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.169217 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cad619ec-5672-46aa-b7d8-2f996561383d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cad619ec-5672-46aa-b7d8-2f996561383d" (UID: "cad619ec-5672-46aa-b7d8-2f996561383d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.175569 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad619ec-5672-46aa-b7d8-2f996561383d-kube-api-access-sz5wn" (OuterVolumeSpecName: "kube-api-access-sz5wn") pod "cad619ec-5672-46aa-b7d8-2f996561383d" (UID: "cad619ec-5672-46aa-b7d8-2f996561383d"). InnerVolumeSpecName "kube-api-access-sz5wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.267438 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4899a057-9b8a-4dd0-8219-9a626d671fdf-operator-scripts\") pod \"4899a057-9b8a-4dd0-8219-9a626d671fdf\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.267623 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj5j9\" (UniqueName: \"kubernetes.io/projected/dbf1edef-446b-4eb8-beba-fde054153a2b-kube-api-access-jj5j9\") pod \"dbf1edef-446b-4eb8-beba-fde054153a2b\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.267648 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s58b\" (UniqueName: \"kubernetes.io/projected/4899a057-9b8a-4dd0-8219-9a626d671fdf-kube-api-access-8s58b\") pod \"4899a057-9b8a-4dd0-8219-9a626d671fdf\" (UID: \"4899a057-9b8a-4dd0-8219-9a626d671fdf\") " Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.267683 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf1edef-446b-4eb8-beba-fde054153a2b-operator-scripts\") pod \"dbf1edef-446b-4eb8-beba-fde054153a2b\" (UID: \"dbf1edef-446b-4eb8-beba-fde054153a2b\") " Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.267949 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4899a057-9b8a-4dd0-8219-9a626d671fdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4899a057-9b8a-4dd0-8219-9a626d671fdf" (UID: "4899a057-9b8a-4dd0-8219-9a626d671fdf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.268366 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf1edef-446b-4eb8-beba-fde054153a2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbf1edef-446b-4eb8-beba-fde054153a2b" (UID: "dbf1edef-446b-4eb8-beba-fde054153a2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.268662 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4899a057-9b8a-4dd0-8219-9a626d671fdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.268685 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cad619ec-5672-46aa-b7d8-2f996561383d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.268695 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz5wn\" (UniqueName: \"kubernetes.io/projected/cad619ec-5672-46aa-b7d8-2f996561383d-kube-api-access-sz5wn\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.268707 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf1edef-446b-4eb8-beba-fde054153a2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.273572 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf1edef-446b-4eb8-beba-fde054153a2b-kube-api-access-jj5j9" (OuterVolumeSpecName: "kube-api-access-jj5j9") pod "dbf1edef-446b-4eb8-beba-fde054153a2b" (UID: "dbf1edef-446b-4eb8-beba-fde054153a2b"). 
InnerVolumeSpecName "kube-api-access-jj5j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.274036 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4899a057-9b8a-4dd0-8219-9a626d671fdf-kube-api-access-8s58b" (OuterVolumeSpecName: "kube-api-access-8s58b") pod "4899a057-9b8a-4dd0-8219-9a626d671fdf" (UID: "4899a057-9b8a-4dd0-8219-9a626d671fdf"). InnerVolumeSpecName "kube-api-access-8s58b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.370632 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj5j9\" (UniqueName: \"kubernetes.io/projected/dbf1edef-446b-4eb8-beba-fde054153a2b-kube-api-access-jj5j9\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.370685 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s58b\" (UniqueName: \"kubernetes.io/projected/4899a057-9b8a-4dd0-8219-9a626d671fdf-kube-api-access-8s58b\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.603894 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerStarted","Data":"454fed63e9fe8536a5bdd1a09ee7d293696c92e3af429d2bef307d31648504a8"} Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.605671 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-adf2-account-create-update-57sjq" event={"ID":"4899a057-9b8a-4dd0-8219-9a626d671fdf","Type":"ContainerDied","Data":"57f88fb9a6ef4447a7bef1a1f03998daf8454adbfc8da11e0625632e726139fb"} Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.605706 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f88fb9a6ef4447a7bef1a1f03998daf8454adbfc8da11e0625632e726139fb" Dec 06 01:23:56 crc 
kubenswrapper[4991]: I1206 01:23:56.605765 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-adf2-account-create-update-57sjq" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.611008 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2l6rx" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.611035 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2l6rx" event={"ID":"cad619ec-5672-46aa-b7d8-2f996561383d","Type":"ContainerDied","Data":"78bf0922cb95bd67bcfc6a83b30a1532d8235b045c48c1edc968547f5269c86c"} Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.611068 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78bf0922cb95bd67bcfc6a83b30a1532d8235b045c48c1edc968547f5269c86c" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.612748 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qt8jb" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.615517 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt8jb" event={"ID":"dbf1edef-446b-4eb8-beba-fde054153a2b","Type":"ContainerDied","Data":"4e43bc060a3fda812ac15af7d60f95aaf526c8c537bcd92a7dc49b4b7db66cb4"} Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.615550 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e43bc060a3fda812ac15af7d60f95aaf526c8c537bcd92a7dc49b4b7db66cb4" Dec 06 01:23:56 crc kubenswrapper[4991]: I1206 01:23:56.957594 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.082156 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f786a82-5143-4f90-aed1-f9d4ffa49f00-operator-scripts\") pod \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.082255 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdwzt\" (UniqueName: \"kubernetes.io/projected/6f786a82-5143-4f90-aed1-f9d4ffa49f00-kube-api-access-rdwzt\") pod \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\" (UID: \"6f786a82-5143-4f90-aed1-f9d4ffa49f00\") " Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.094051 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f786a82-5143-4f90-aed1-f9d4ffa49f00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f786a82-5143-4f90-aed1-f9d4ffa49f00" (UID: "6f786a82-5143-4f90-aed1-f9d4ffa49f00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.094404 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f786a82-5143-4f90-aed1-f9d4ffa49f00-kube-api-access-rdwzt" (OuterVolumeSpecName: "kube-api-access-rdwzt") pod "6f786a82-5143-4f90-aed1-f9d4ffa49f00" (UID: "6f786a82-5143-4f90-aed1-f9d4ffa49f00"). InnerVolumeSpecName "kube-api-access-rdwzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.165467 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.167552 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.202816 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f786a82-5143-4f90-aed1-f9d4ffa49f00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.202925 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdwzt\" (UniqueName: \"kubernetes.io/projected/6f786a82-5143-4f90-aed1-f9d4ffa49f00-kube-api-access-rdwzt\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.304508 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kzc2\" (UniqueName: \"kubernetes.io/projected/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-kube-api-access-2kzc2\") pod \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.304598 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-operator-scripts\") pod \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\" (UID: \"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83\") " Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.304677 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ac1a63-4582-436a-a235-bb4fe90fe6f3-operator-scripts\") pod \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.304705 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22pcc\" (UniqueName: \"kubernetes.io/projected/22ac1a63-4582-436a-a235-bb4fe90fe6f3-kube-api-access-22pcc\") pod \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\" (UID: \"22ac1a63-4582-436a-a235-bb4fe90fe6f3\") " Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.305612 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ac1a63-4582-436a-a235-bb4fe90fe6f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22ac1a63-4582-436a-a235-bb4fe90fe6f3" (UID: "22ac1a63-4582-436a-a235-bb4fe90fe6f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.306201 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" (UID: "9f87ca3d-5755-4aa8-8cd0-99f6841bbd83"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.306949 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.307022 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ac1a63-4582-436a-a235-bb4fe90fe6f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.309747 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ac1a63-4582-436a-a235-bb4fe90fe6f3-kube-api-access-22pcc" (OuterVolumeSpecName: "kube-api-access-22pcc") pod "22ac1a63-4582-436a-a235-bb4fe90fe6f3" (UID: "22ac1a63-4582-436a-a235-bb4fe90fe6f3"). InnerVolumeSpecName "kube-api-access-22pcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.310059 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-kube-api-access-2kzc2" (OuterVolumeSpecName: "kube-api-access-2kzc2") pod "9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" (UID: "9f87ca3d-5755-4aa8-8cd0-99f6841bbd83"). InnerVolumeSpecName "kube-api-access-2kzc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.409580 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kzc2\" (UniqueName: \"kubernetes.io/projected/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83-kube-api-access-2kzc2\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.410038 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22pcc\" (UniqueName: \"kubernetes.io/projected/22ac1a63-4582-436a-a235-bb4fe90fe6f3-kube-api-access-22pcc\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.623910 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" event={"ID":"22ac1a63-4582-436a-a235-bb4fe90fe6f3","Type":"ContainerDied","Data":"cdae0988a135b2009094dfe2b50f0009f6a1e39a58e62e7df4e94c668352eba7"} Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.623942 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-92a2-account-create-update-8xs8q" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.623960 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdae0988a135b2009094dfe2b50f0009f6a1e39a58e62e7df4e94c668352eba7" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.626258 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" event={"ID":"9f87ca3d-5755-4aa8-8cd0-99f6841bbd83","Type":"ContainerDied","Data":"942e7baa801e6c5fc2fb45207217f9f95c223d787e0b87f2d209dd870358a392"} Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.626287 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="942e7baa801e6c5fc2fb45207217f9f95c223d787e0b87f2d209dd870358a392" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.626353 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f23d-account-create-update-lk7cd" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.635196 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerStarted","Data":"0624249c154e64232e5835a9d8a401445dd224985723ef2f16bf3a0796438550"} Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.635392 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-central-agent" containerID="cri-o://e8d24f81c46907da68d292f3bf6c8d16230dc3ffe0ed92b1391a1069494e827f" gracePeriod=30 Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.635482 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.635819 4991 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="proxy-httpd" containerID="cri-o://0624249c154e64232e5835a9d8a401445dd224985723ef2f16bf3a0796438550" gracePeriod=30 Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.635870 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="sg-core" containerID="cri-o://454fed63e9fe8536a5bdd1a09ee7d293696c92e3af429d2bef307d31648504a8" gracePeriod=30 Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.635909 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-notification-agent" containerID="cri-o://5b713ecd2f19e5cfef2d08df7794411b3b31acae252f465ac5ecfce528146af7" gracePeriod=30 Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.656318 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hctc4" event={"ID":"6f786a82-5143-4f90-aed1-f9d4ffa49f00","Type":"ContainerDied","Data":"b018eeb4158051ea23fa4e1cfb520f7cfdc9638e5bb644a5165f6fef6a71830e"} Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.656372 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b018eeb4158051ea23fa4e1cfb520f7cfdc9638e5bb644a5165f6fef6a71830e" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.656487 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hctc4" Dec 06 01:23:57 crc kubenswrapper[4991]: I1206 01:23:57.668888 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.57617238 podStartE2EDuration="5.668850365s" podCreationTimestamp="2025-12-06 01:23:52 +0000 UTC" firstStartedPulling="2025-12-06 01:23:54.109605155 +0000 UTC m=+1307.459895623" lastFinishedPulling="2025-12-06 01:23:57.20228314 +0000 UTC m=+1310.552573608" observedRunningTime="2025-12-06 01:23:57.655033268 +0000 UTC m=+1311.005323736" watchObservedRunningTime="2025-12-06 01:23:57.668850365 +0000 UTC m=+1311.019140853" Dec 06 01:23:58 crc kubenswrapper[4991]: I1206 01:23:58.668290 4991 generic.go:334] "Generic (PLEG): container finished" podID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerID="0624249c154e64232e5835a9d8a401445dd224985723ef2f16bf3a0796438550" exitCode=0 Dec 06 01:23:58 crc kubenswrapper[4991]: I1206 01:23:58.668656 4991 generic.go:334] "Generic (PLEG): container finished" podID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerID="454fed63e9fe8536a5bdd1a09ee7d293696c92e3af429d2bef307d31648504a8" exitCode=2 Dec 06 01:23:58 crc kubenswrapper[4991]: I1206 01:23:58.668672 4991 generic.go:334] "Generic (PLEG): container finished" podID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerID="5b713ecd2f19e5cfef2d08df7794411b3b31acae252f465ac5ecfce528146af7" exitCode=0 Dec 06 01:23:58 crc kubenswrapper[4991]: I1206 01:23:58.668497 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerDied","Data":"0624249c154e64232e5835a9d8a401445dd224985723ef2f16bf3a0796438550"} Dec 06 01:23:58 crc kubenswrapper[4991]: I1206 01:23:58.668717 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerDied","Data":"454fed63e9fe8536a5bdd1a09ee7d293696c92e3af429d2bef307d31648504a8"} Dec 06 01:23:58 crc kubenswrapper[4991]: I1206 01:23:58.668736 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerDied","Data":"5b713ecd2f19e5cfef2d08df7794411b3b31acae252f465ac5ecfce528146af7"} Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.684555 4991 generic.go:334] "Generic (PLEG): container finished" podID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerID="9c8d337db25b8faca94ece7dc0776b6fb45830b2b42f2b7d52e4a5a4778a214e" exitCode=137 Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.684682 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b4cd4b7d8-99kjr" event={"ID":"ff6a4088-3509-40aa-a3f6-917e289cd44f","Type":"ContainerDied","Data":"9c8d337db25b8faca94ece7dc0776b6fb45830b2b42f2b7d52e4a5a4778a214e"} Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.685097 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b4cd4b7d8-99kjr" event={"ID":"ff6a4088-3509-40aa-a3f6-917e289cd44f","Type":"ContainerDied","Data":"5343ad0f9b14637fe0702df3ffd775e7921cceca8119e9ad5a88b4e1f5d15d44"} Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.685119 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5343ad0f9b14637fe0702df3ffd775e7921cceca8119e9ad5a88b4e1f5d15d44" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.687903 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.776685 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-tls-certs\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.776740 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-secret-key\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.776760 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-combined-ca-bundle\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.776862 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-scripts\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.776915 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6a4088-3509-40aa-a3f6-917e289cd44f-logs\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.777593 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ff6a4088-3509-40aa-a3f6-917e289cd44f-logs" (OuterVolumeSpecName: "logs") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.777821 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvl2d\" (UniqueName: \"kubernetes.io/projected/ff6a4088-3509-40aa-a3f6-917e289cd44f-kube-api-access-xvl2d\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.777887 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-config-data\") pod \"ff6a4088-3509-40aa-a3f6-917e289cd44f\" (UID: \"ff6a4088-3509-40aa-a3f6-917e289cd44f\") " Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.778445 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6a4088-3509-40aa-a3f6-917e289cd44f-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.784732 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.800221 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6a4088-3509-40aa-a3f6-917e289cd44f-kube-api-access-xvl2d" (OuterVolumeSpecName: "kube-api-access-xvl2d") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "kube-api-access-xvl2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.805272 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-scripts" (OuterVolumeSpecName: "scripts") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.805797 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-config-data" (OuterVolumeSpecName: "config-data") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.812615 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.859373 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ff6a4088-3509-40aa-a3f6-917e289cd44f" (UID: "ff6a4088-3509-40aa-a3f6-917e289cd44f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.880623 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.880683 4991 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.880695 4991 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.880710 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6a4088-3509-40aa-a3f6-917e289cd44f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.880720 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6a4088-3509-40aa-a3f6-917e289cd44f-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:23:59 crc kubenswrapper[4991]: I1206 01:23:59.880731 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvl2d\" (UniqueName: 
\"kubernetes.io/projected/ff6a4088-3509-40aa-a3f6-917e289cd44f-kube-api-access-xvl2d\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:00 crc kubenswrapper[4991]: I1206 01:24:00.694049 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b4cd4b7d8-99kjr" Dec 06 01:24:00 crc kubenswrapper[4991]: I1206 01:24:00.743791 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b4cd4b7d8-99kjr"] Dec 06 01:24:00 crc kubenswrapper[4991]: I1206 01:24:00.752576 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b4cd4b7d8-99kjr"] Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.049486 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" path="/var/lib/kubelet/pods/ff6a4088-3509-40aa-a3f6-917e289cd44f/volumes" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.704249 4991 generic.go:334] "Generic (PLEG): container finished" podID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerID="e8d24f81c46907da68d292f3bf6c8d16230dc3ffe0ed92b1391a1069494e827f" exitCode=0 Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.704327 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerDied","Data":"e8d24f81c46907da68d292f3bf6c8d16230dc3ffe0ed92b1391a1069494e827f"} Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.776397 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824067 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-log-httpd\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824131 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-sg-core-conf-yaml\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824244 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-run-httpd\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824273 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-config-data\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824302 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-scripts\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824331 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6j42\" (UniqueName: 
\"kubernetes.io/projected/4b475d4e-b396-4747-9aba-b6ff6bed11a1-kube-api-access-l6j42\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.824355 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-combined-ca-bundle\") pod \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\" (UID: \"4b475d4e-b396-4747-9aba-b6ff6bed11a1\") " Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.825613 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.825838 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.838613 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-scripts" (OuterVolumeSpecName: "scripts") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.838688 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b475d4e-b396-4747-9aba-b6ff6bed11a1-kube-api-access-l6j42" (OuterVolumeSpecName: "kube-api-access-l6j42") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). InnerVolumeSpecName "kube-api-access-l6j42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.855825 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.915477 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.927216 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.927249 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.927261 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b475d4e-b396-4747-9aba-b6ff6bed11a1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.927271 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.927283 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6j42\" (UniqueName: \"kubernetes.io/projected/4b475d4e-b396-4747-9aba-b6ff6bed11a1-kube-api-access-l6j42\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.927295 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:01 crc kubenswrapper[4991]: I1206 01:24:01.935129 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-config-data" (OuterVolumeSpecName: "config-data") pod "4b475d4e-b396-4747-9aba-b6ff6bed11a1" (UID: "4b475d4e-b396-4747-9aba-b6ff6bed11a1"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.029457 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b475d4e-b396-4747-9aba-b6ff6bed11a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.714827 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b475d4e-b396-4747-9aba-b6ff6bed11a1","Type":"ContainerDied","Data":"627114f2d025c2690b99271060a1df93ebb8c9b1a85f9f110a2ea045f5cdbefa"} Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.714891 4991 scope.go:117] "RemoveContainer" containerID="0624249c154e64232e5835a9d8a401445dd224985723ef2f16bf3a0796438550" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.715543 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.741167 4991 scope.go:117] "RemoveContainer" containerID="454fed63e9fe8536a5bdd1a09ee7d293696c92e3af429d2bef307d31648504a8" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.762911 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.764539 4991 scope.go:117] "RemoveContainer" containerID="5b713ecd2f19e5cfef2d08df7794411b3b31acae252f465ac5ecfce528146af7" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.777244 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.824254 4991 scope.go:117] "RemoveContainer" containerID="e8d24f81c46907da68d292f3bf6c8d16230dc3ffe0ed92b1391a1069494e827f" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.868776 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:02 crc 
kubenswrapper[4991]: E1206 01:24:02.869191 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869208 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869218 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4899a057-9b8a-4dd0-8219-9a626d671fdf" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869224 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4899a057-9b8a-4dd0-8219-9a626d671fdf" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869236 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-central-agent" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869242 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-central-agent" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869251 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon-log" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869257 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon-log" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869267 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-notification-agent" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869273 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" 
containerName="ceilometer-notification-agent" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869288 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="proxy-httpd" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869294 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="proxy-httpd" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869312 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ac1a63-4582-436a-a235-bb4fe90fe6f3" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869317 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ac1a63-4582-436a-a235-bb4fe90fe6f3" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869328 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="sg-core" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869335 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="sg-core" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869344 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869350 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869362 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf1edef-446b-4eb8-beba-fde054153a2b" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869368 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dbf1edef-446b-4eb8-beba-fde054153a2b" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869380 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad619ec-5672-46aa-b7d8-2f996561383d" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869386 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad619ec-5672-46aa-b7d8-2f996561383d" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: E1206 01:24:02.869604 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f786a82-5143-4f90-aed1-f9d4ffa49f00" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869611 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f786a82-5143-4f90-aed1-f9d4ffa49f00" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869783 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869795 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-notification-agent" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869804 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ac1a63-4582-436a-a235-bb4fe90fe6f3" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869817 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="proxy-httpd" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869830 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf1edef-446b-4eb8-beba-fde054153a2b" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 
01:24:02.869843 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad619ec-5672-46aa-b7d8-2f996561383d" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869855 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6a4088-3509-40aa-a3f6-917e289cd44f" containerName="horizon-log" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869863 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4899a057-9b8a-4dd0-8219-9a626d671fdf" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869876 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="ceilometer-central-agent" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869884 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f786a82-5143-4f90-aed1-f9d4ffa49f00" containerName="mariadb-database-create" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869894 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" containerName="mariadb-account-create-update" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.869902 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" containerName="sg-core" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.872590 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.886844 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.909424 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.929767 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950033 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-log-httpd\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950093 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-scripts\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950124 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6d6\" (UniqueName: \"kubernetes.io/projected/0e394cbd-aea5-4c26-841a-9c504c6a49ef-kube-api-access-pw6d6\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950178 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " 
pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950212 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950237 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-run-httpd\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:02 crc kubenswrapper[4991]: I1206 01:24:02.950253 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-config-data\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.052039 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b475d4e-b396-4747-9aba-b6ff6bed11a1" path="/var/lib/kubelet/pods/4b475d4e-b396-4747-9aba-b6ff6bed11a1/volumes" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.052191 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-log-httpd\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.052810 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-scripts\") pod 
\"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.052910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6d6\" (UniqueName: \"kubernetes.io/projected/0e394cbd-aea5-4c26-841a-9c504c6a49ef-kube-api-access-pw6d6\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.052642 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-log-httpd\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.053132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.053237 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.053325 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-run-httpd\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.053398 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-config-data\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.056371 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-run-httpd\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.064910 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-scripts\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.071685 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.072173 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-config-data\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.072400 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 
01:24:03.077430 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6d6\" (UniqueName: \"kubernetes.io/projected/0e394cbd-aea5-4c26-841a-9c504c6a49ef-kube-api-access-pw6d6\") pod \"ceilometer-0\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.111806 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.197064 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.437881 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ktmxx"] Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.443669 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.457504 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.457694 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pdgft" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.457815 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.465088 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 
01:24:03.465201 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-scripts\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.465226 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jngdk\" (UniqueName: \"kubernetes.io/projected/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-kube-api-access-jngdk\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.465564 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-config-data\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.468256 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ktmxx"] Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.567531 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.567677 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-scripts\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.567699 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jngdk\" (UniqueName: \"kubernetes.io/projected/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-kube-api-access-jngdk\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.567740 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-config-data\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.576564 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.579758 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-scripts\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.580529 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-config-data\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.607061 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jngdk\" (UniqueName: \"kubernetes.io/projected/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-kube-api-access-jngdk\") pod \"nova-cell0-conductor-db-sync-ktmxx\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.765513 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:03 crc kubenswrapper[4991]: I1206 01:24:03.778232 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:04 crc kubenswrapper[4991]: I1206 01:24:04.318509 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ktmxx"] Dec 06 01:24:04 crc kubenswrapper[4991]: W1206 01:24:04.329252 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4490e9af_2abc_4d2c_83e2_ce5b7324a32a.slice/crio-306db855ef1d81c35620b9147446ff8f5d8cbbaef9a6511cb79d5ffd37b2338d WatchSource:0}: Error finding container 306db855ef1d81c35620b9147446ff8f5d8cbbaef9a6511cb79d5ffd37b2338d: Status 404 returned error can't find the container with id 306db855ef1d81c35620b9147446ff8f5d8cbbaef9a6511cb79d5ffd37b2338d Dec 06 01:24:04 crc kubenswrapper[4991]: I1206 01:24:04.740960 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerStarted","Data":"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706"} Dec 06 01:24:04 crc 
kubenswrapper[4991]: I1206 01:24:04.741561 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerStarted","Data":"8282f5f91b8b8d38f59ebc6f7ee47efed49d21e9b0d47d33041177844b3704e7"} Dec 06 01:24:04 crc kubenswrapper[4991]: I1206 01:24:04.742501 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" event={"ID":"4490e9af-2abc-4d2c-83e2-ce5b7324a32a","Type":"ContainerStarted","Data":"306db855ef1d81c35620b9147446ff8f5d8cbbaef9a6511cb79d5ffd37b2338d"} Dec 06 01:24:05 crc kubenswrapper[4991]: I1206 01:24:05.758461 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerStarted","Data":"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca"} Dec 06 01:24:07 crc kubenswrapper[4991]: I1206 01:24:07.581175 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:07 crc kubenswrapper[4991]: I1206 01:24:07.782417 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerStarted","Data":"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b"} Dec 06 01:24:12 crc kubenswrapper[4991]: I1206 01:24:12.409541 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:24:12 crc kubenswrapper[4991]: I1206 01:24:12.410333 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-log" containerID="cri-o://aa2eb5f0f8a82137ce365cd29c102ac4411dd75fd6727867985f43f055aecaae" gracePeriod=30 Dec 06 01:24:12 crc kubenswrapper[4991]: I1206 01:24:12.410426 4991 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-external-api-0" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-httpd" containerID="cri-o://5cf4078301fb73ac5dccc6888d936b4d9b3cf6e69e8ef008ad49e9f54b072c31" gracePeriod=30 Dec 06 01:24:12 crc kubenswrapper[4991]: I1206 01:24:12.939364 4991 generic.go:334] "Generic (PLEG): container finished" podID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerID="aa2eb5f0f8a82137ce365cd29c102ac4411dd75fd6727867985f43f055aecaae" exitCode=143 Dec 06 01:24:12 crc kubenswrapper[4991]: I1206 01:24:12.939433 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e74c1123-26eb-43c0-935f-4e81cfebf716","Type":"ContainerDied","Data":"aa2eb5f0f8a82137ce365cd29c102ac4411dd75fd6727867985f43f055aecaae"} Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.966968 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" event={"ID":"4490e9af-2abc-4d2c-83e2-ce5b7324a32a","Type":"ContainerStarted","Data":"e4520f4c284293c26259544c9abb2578872b250069cdf4ad7a9791a263cff66c"} Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.973764 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerStarted","Data":"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c"} Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.974009 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.974089 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="sg-core" containerID="cri-o://f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" gracePeriod=30 Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.974117 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-notification-agent" containerID="cri-o://d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" gracePeriod=30 Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.974207 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="proxy-httpd" containerID="cri-o://1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" gracePeriod=30 Dec 06 01:24:14 crc kubenswrapper[4991]: I1206 01:24:14.974302 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-central-agent" containerID="cri-o://64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" gracePeriod=30 Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.009365 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" podStartSLOduration=2.11741735 podStartE2EDuration="12.009347917s" podCreationTimestamp="2025-12-06 01:24:03 +0000 UTC" firstStartedPulling="2025-12-06 01:24:04.334460863 +0000 UTC m=+1317.684751331" lastFinishedPulling="2025-12-06 01:24:14.22639143 +0000 UTC m=+1327.576681898" observedRunningTime="2025-12-06 01:24:14.999518903 +0000 UTC m=+1328.349809371" watchObservedRunningTime="2025-12-06 01:24:15.009347917 +0000 UTC m=+1328.359638385" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.038190 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.592947217 podStartE2EDuration="13.038170282s" podCreationTimestamp="2025-12-06 01:24:02 +0000 UTC" firstStartedPulling="2025-12-06 01:24:03.778197768 +0000 UTC m=+1317.128488236" 
lastFinishedPulling="2025-12-06 01:24:14.223420833 +0000 UTC m=+1327.573711301" observedRunningTime="2025-12-06 01:24:15.031292155 +0000 UTC m=+1328.381582623" watchObservedRunningTime="2025-12-06 01:24:15.038170282 +0000 UTC m=+1328.388460750" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.369063 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.369610 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-log" containerID="cri-o://70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7" gracePeriod=30 Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.369665 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-httpd" containerID="cri-o://026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6" gracePeriod=30 Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.825689 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892044 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-combined-ca-bundle\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892127 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-scripts\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892172 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-run-httpd\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892368 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-log-httpd\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892469 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6d6\" (UniqueName: \"kubernetes.io/projected/0e394cbd-aea5-4c26-841a-9c504c6a49ef-kube-api-access-pw6d6\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892502 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-sg-core-conf-yaml\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.892523 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-config-data\") pod \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\" (UID: \"0e394cbd-aea5-4c26-841a-9c504c6a49ef\") " Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.893349 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.899294 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e394cbd-aea5-4c26-841a-9c504c6a49ef-kube-api-access-pw6d6" (OuterVolumeSpecName: "kube-api-access-pw6d6") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "kube-api-access-pw6d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.899662 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.910071 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-scripts" (OuterVolumeSpecName: "scripts") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.947508 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.985263 4991 generic.go:334] "Generic (PLEG): container finished" podID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerID="70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7" exitCode=143 Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.985337 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f8e1b96-e706-4f50-b033-b0fb66936deb","Type":"ContainerDied","Data":"70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7"} Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.997484 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.997507 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 
06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.997516 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e394cbd-aea5-4c26-841a-9c504c6a49ef-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.997527 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6d6\" (UniqueName: \"kubernetes.io/projected/0e394cbd-aea5-4c26-841a-9c504c6a49ef-kube-api-access-pw6d6\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:15 crc kubenswrapper[4991]: I1206 01:24:15.997537 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004320 4991 generic.go:334] "Generic (PLEG): container finished" podID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" exitCode=0 Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004362 4991 generic.go:334] "Generic (PLEG): container finished" podID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" exitCode=2 Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004374 4991 generic.go:334] "Generic (PLEG): container finished" podID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" exitCode=0 Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004384 4991 generic.go:334] "Generic (PLEG): container finished" podID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" exitCode=0 Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004418 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004451 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerDied","Data":"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c"} Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004493 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerDied","Data":"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b"} Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004503 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerDied","Data":"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca"} Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004513 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerDied","Data":"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706"} Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004521 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e394cbd-aea5-4c26-841a-9c504c6a49ef","Type":"ContainerDied","Data":"8282f5f91b8b8d38f59ebc6f7ee47efed49d21e9b0d47d33041177844b3704e7"} Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.004541 4991 scope.go:117] "RemoveContainer" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.016528 4991 generic.go:334] "Generic (PLEG): container finished" podID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerID="5cf4078301fb73ac5dccc6888d936b4d9b3cf6e69e8ef008ad49e9f54b072c31" exitCode=0 Dec 06 01:24:16 crc 
kubenswrapper[4991]: I1206 01:24:16.018218 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e74c1123-26eb-43c0-935f-4e81cfebf716","Type":"ContainerDied","Data":"5cf4078301fb73ac5dccc6888d936b4d9b3cf6e69e8ef008ad49e9f54b072c31"} Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.023383 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.027973 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-config-data" (OuterVolumeSpecName: "config-data") pod "0e394cbd-aea5-4c26-841a-9c504c6a49ef" (UID: "0e394cbd-aea5-4c26-841a-9c504c6a49ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.069414 4991 scope.go:117] "RemoveContainer" containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.105124 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.105176 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e394cbd-aea5-4c26-841a-9c504c6a49ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.109637 4991 scope.go:117] "RemoveContainer" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.191096 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.247558 4991 scope.go:117] "RemoveContainer" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.310677 4991 scope.go:117] "RemoveContainer" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.310860 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-scripts\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.311642 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-httpd-run\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.311708 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-config-data\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.311745 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-logs\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.311867 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.312228 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-combined-ca-bundle\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.312321 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fcsl\" (UniqueName: \"kubernetes.io/projected/e74c1123-26eb-43c0-935f-4e81cfebf716-kube-api-access-6fcsl\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.312407 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-public-tls-certs\") pod \"e74c1123-26eb-43c0-935f-4e81cfebf716\" (UID: \"e74c1123-26eb-43c0-935f-4e81cfebf716\") " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.312406 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.312693 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-logs" (OuterVolumeSpecName: "logs") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.313107 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.313130 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e74c1123-26eb-43c0-935f-4e81cfebf716-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.319199 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74c1123-26eb-43c0-935f-4e81cfebf716-kube-api-access-6fcsl" (OuterVolumeSpecName: "kube-api-access-6fcsl") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "kube-api-access-6fcsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.319321 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.319316 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-scripts" (OuterVolumeSpecName: "scripts") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.321818 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": container with ID starting with 1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c not found: ID does not exist" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.321999 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c"} err="failed to get container status \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": rpc error: code = NotFound desc = could not find container \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": container with ID starting with 1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.322122 4991 scope.go:117] "RemoveContainer" containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.323659 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": container with ID starting with f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b not found: ID does not exist" containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.323770 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b"} err="failed 
to get container status \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": rpc error: code = NotFound desc = could not find container \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": container with ID starting with f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.323842 4991 scope.go:117] "RemoveContainer" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.324296 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": container with ID starting with d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca not found: ID does not exist" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.324350 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca"} err="failed to get container status \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": rpc error: code = NotFound desc = could not find container \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": container with ID starting with d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.324380 4991 scope.go:117] "RemoveContainer" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.324958 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": container with ID starting with 64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706 not found: ID does not exist" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.325032 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706"} err="failed to get container status \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": rpc error: code = NotFound desc = could not find container \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": container with ID starting with 64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706 not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.325054 4991 scope.go:117] "RemoveContainer" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.325381 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c"} err="failed to get container status \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": rpc error: code = NotFound desc = could not find container \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": container with ID starting with 1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.325481 4991 scope.go:117] "RemoveContainer" containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.326010 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b"} err="failed to get container status \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": rpc error: code = NotFound desc = could not find container \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": container with ID starting with f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.326112 4991 scope.go:117] "RemoveContainer" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.326388 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca"} err="failed to get container status \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": rpc error: code = NotFound desc = could not find container \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": container with ID starting with d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.326486 4991 scope.go:117] "RemoveContainer" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.326862 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706"} err="failed to get container status \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": rpc error: code = NotFound desc = could not find container \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": container with ID starting with 64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706 not found: ID does not 
exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.327059 4991 scope.go:117] "RemoveContainer" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.327366 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c"} err="failed to get container status \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": rpc error: code = NotFound desc = could not find container \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": container with ID starting with 1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.327445 4991 scope.go:117] "RemoveContainer" containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.327693 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b"} err="failed to get container status \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": rpc error: code = NotFound desc = could not find container \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": container with ID starting with f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.327775 4991 scope.go:117] "RemoveContainer" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.328046 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca"} err="failed to get container status 
\"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": rpc error: code = NotFound desc = could not find container \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": container with ID starting with d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.328133 4991 scope.go:117] "RemoveContainer" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.328387 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706"} err="failed to get container status \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": rpc error: code = NotFound desc = could not find container \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": container with ID starting with 64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706 not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.328473 4991 scope.go:117] "RemoveContainer" containerID="1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.328760 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c"} err="failed to get container status \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": rpc error: code = NotFound desc = could not find container \"1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c\": container with ID starting with 1b1aa8166dce0a49bb56209af0eb5abb57f06febaef3cbc839845cc48289692c not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.328838 4991 scope.go:117] "RemoveContainer" 
containerID="f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.329114 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b"} err="failed to get container status \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": rpc error: code = NotFound desc = could not find container \"f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b\": container with ID starting with f106749d82599cac0a5c27f80233b2c668719e84730836ce58ec4ed148edeb9b not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.329188 4991 scope.go:117] "RemoveContainer" containerID="d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.329445 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca"} err="failed to get container status \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": rpc error: code = NotFound desc = could not find container \"d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca\": container with ID starting with d2c8da9bd39427422a5b745f5d5747fa0636df3e7c129c4244da753ffd593dca not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.329525 4991 scope.go:117] "RemoveContainer" containerID="64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.329811 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706"} err="failed to get container status \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": rpc error: code = NotFound desc = could 
not find container \"64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706\": container with ID starting with 64d0945d2549d8e61f2ece12542b441467d8076591d57e1483a74b8e31819706 not found: ID does not exist" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.356417 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.378133 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-config-data" (OuterVolumeSpecName: "config-data") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.383752 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e74c1123-26eb-43c0-935f-4e81cfebf716" (UID: "e74c1123-26eb-43c0-935f-4e81cfebf716"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.414739 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fcsl\" (UniqueName: \"kubernetes.io/projected/e74c1123-26eb-43c0-935f-4e81cfebf716-kube-api-access-6fcsl\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.414781 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.414793 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.414802 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.414838 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.414849 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1123-26eb-43c0-935f-4e81cfebf716-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.439351 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.457232 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.479310 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.487766 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.489591 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-notification-agent" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.489640 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-notification-agent" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.489671 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-central-agent" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.489678 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-central-agent" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.489689 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-httpd" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.489715 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-httpd" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.489733 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-log" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.489739 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-log" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 
01:24:16.489750 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="proxy-httpd" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.489756 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="proxy-httpd" Dec 06 01:24:16 crc kubenswrapper[4991]: E1206 01:24:16.489766 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="sg-core" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.489772 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="sg-core" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.490216 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-httpd" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.490260 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-notification-agent" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.490273 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" containerName="glance-log" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.490284 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="ceilometer-central-agent" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.490300 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="proxy-httpd" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.490313 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" containerName="sg-core" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 
01:24:16.495676 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.498250 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.502831 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.518037 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.525958 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.619591 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-log-httpd\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.620177 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-scripts\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.620228 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-config-data\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.620259 
4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.620285 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.620321 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsdb\" (UniqueName: \"kubernetes.io/projected/82992cbc-9e0f-44a8-bd20-06a746836080-kube-api-access-thsdb\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.620337 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-run-httpd\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722093 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsdb\" (UniqueName: \"kubernetes.io/projected/82992cbc-9e0f-44a8-bd20-06a746836080-kube-api-access-thsdb\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722195 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-run-httpd\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722245 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-log-httpd\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722315 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-scripts\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722368 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-config-data\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722405 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.722441 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.723065 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-run-httpd\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.723277 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-log-httpd\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.727930 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-scripts\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.728404 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.728909 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.729362 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-config-data\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 
01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.740742 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsdb\" (UniqueName: \"kubernetes.io/projected/82992cbc-9e0f-44a8-bd20-06a746836080-kube-api-access-thsdb\") pod \"ceilometer-0\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " pod="openstack/ceilometer-0" Dec 06 01:24:16 crc kubenswrapper[4991]: I1206 01:24:16.817950 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.045540 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.057435 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e394cbd-aea5-4c26-841a-9c504c6a49ef" path="/var/lib/kubelet/pods/0e394cbd-aea5-4c26-841a-9c504c6a49ef/volumes" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.058751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e74c1123-26eb-43c0-935f-4e81cfebf716","Type":"ContainerDied","Data":"4920aac871b90fab1103f6babeca5ad9c803cafb80961d4db51b37240c5c921b"} Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.058805 4991 scope.go:117] "RemoveContainer" containerID="5cf4078301fb73ac5dccc6888d936b4d9b3cf6e69e8ef008ad49e9f54b072c31" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.106284 4991 scope.go:117] "RemoveContainer" containerID="aa2eb5f0f8a82137ce365cd29c102ac4411dd75fd6727867985f43f055aecaae" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.106430 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.119885 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:24:17 crc 
kubenswrapper[4991]: I1206 01:24:17.141840 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.143911 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.148336 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.158826 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.160772 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240382 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240514 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kchz\" (UniqueName: \"kubernetes.io/projected/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-kube-api-access-9kchz\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240593 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240664 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-logs\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240742 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240820 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240908 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.240998 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.342735 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.342820 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-logs\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.342868 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.342914 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.342970 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 
01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.343026 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.343074 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.343096 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kchz\" (UniqueName: \"kubernetes.io/projected/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-kube-api-access-9kchz\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.345359 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-logs\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.345508 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.345563 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.353375 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.354319 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.355695 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.356398 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.366907 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kchz\" (UniqueName: 
\"kubernetes.io/projected/7b96bae9-f846-4e7e-b1df-3d51b139dcb5-kube-api-access-9kchz\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.384291 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7b96bae9-f846-4e7e-b1df-3d51b139dcb5\") " pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.462850 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.570227 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:17 crc kubenswrapper[4991]: W1206 01:24:17.586826 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82992cbc_9e0f_44a8_bd20_06a746836080.slice/crio-5089b706978a91c1112270f076d8ae7d84fb35fb1d52e8ad3c26b3e86b6ed5d1 WatchSource:0}: Error finding container 5089b706978a91c1112270f076d8ae7d84fb35fb1d52e8ad3c26b3e86b6ed5d1: Status 404 returned error can't find the container with id 5089b706978a91c1112270f076d8ae7d84fb35fb1d52e8ad3c26b3e86b6ed5d1 Dec 06 01:24:17 crc kubenswrapper[4991]: I1206 01:24:17.953169 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:18 crc kubenswrapper[4991]: I1206 01:24:18.050417 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerStarted","Data":"5089b706978a91c1112270f076d8ae7d84fb35fb1d52e8ad3c26b3e86b6ed5d1"} Dec 06 01:24:18 crc kubenswrapper[4991]: I1206 01:24:18.066790 4991 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.054450 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74c1123-26eb-43c0-935f-4e81cfebf716" path="/var/lib/kubelet/pods/e74c1123-26eb-43c0-935f-4e81cfebf716/volumes" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.067380 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.074190 4991 generic.go:334] "Generic (PLEG): container finished" podID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerID="026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6" exitCode=0 Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.074259 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f8e1b96-e706-4f50-b033-b0fb66936deb","Type":"ContainerDied","Data":"026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6"} Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.074300 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f8e1b96-e706-4f50-b033-b0fb66936deb","Type":"ContainerDied","Data":"81f722b0b530fbbf80e3ab03e0964afc4bb44f27cbb6f08260eee6176b260f60"} Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.074323 4991 scope.go:117] "RemoveContainer" containerID="026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.074476 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.079406 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b96bae9-f846-4e7e-b1df-3d51b139dcb5","Type":"ContainerStarted","Data":"af32a739f711375d36b14e063c04115e707b459ea8d4eddc0a6ce5378ce582cf"} Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.079458 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b96bae9-f846-4e7e-b1df-3d51b139dcb5","Type":"ContainerStarted","Data":"560e52467cb749abf664e26f828594c99f6753cc666c43a27197ca630161b619"} Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.086703 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerStarted","Data":"4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917"} Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.086751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerStarted","Data":"ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44"} Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.182591 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpq4m\" (UniqueName: \"kubernetes.io/projected/7f8e1b96-e706-4f50-b033-b0fb66936deb-kube-api-access-bpq4m\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.184078 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-httpd-run\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: 
\"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.184821 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-internal-tls-certs\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.185119 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-scripts\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.188413 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.188670 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-logs\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.188790 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-config-data\") pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.188895 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-combined-ca-bundle\") 
pod \"7f8e1b96-e706-4f50-b033-b0fb66936deb\" (UID: \"7f8e1b96-e706-4f50-b033-b0fb66936deb\") " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.198224 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8e1b96-e706-4f50-b033-b0fb66936deb-kube-api-access-bpq4m" (OuterVolumeSpecName: "kube-api-access-bpq4m") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "kube-api-access-bpq4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.198620 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.201592 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-scripts" (OuterVolumeSpecName: "scripts") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.207158 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-logs" (OuterVolumeSpecName: "logs") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.214031 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.264058 4991 scope.go:117] "RemoveContainer" containerID="70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.270853 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-config-data" (OuterVolumeSpecName: "config-data") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.292075 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.292137 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpq4m\" (UniqueName: \"kubernetes.io/projected/7f8e1b96-e706-4f50-b033-b0fb66936deb-kube-api-access-bpq4m\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.292156 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.292169 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.292182 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f8e1b96-e706-4f50-b033-b0fb66936deb-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.292273 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.322314 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.325142 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.364095 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f8e1b96-e706-4f50-b033-b0fb66936deb" (UID: "7f8e1b96-e706-4f50-b033-b0fb66936deb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.394675 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.394717 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.394727 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8e1b96-e706-4f50-b033-b0fb66936deb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.432262 4991 scope.go:117] "RemoveContainer" containerID="026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6" Dec 06 01:24:19 crc kubenswrapper[4991]: E1206 01:24:19.453549 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6\": container with 
ID starting with 026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6 not found: ID does not exist" containerID="026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.453621 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6"} err="failed to get container status \"026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6\": rpc error: code = NotFound desc = could not find container \"026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6\": container with ID starting with 026fafed64204e02cd1b4e5a4aa3520e431c2610853beac3abc984b0f672b8b6 not found: ID does not exist" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.453665 4991 scope.go:117] "RemoveContainer" containerID="70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7" Dec 06 01:24:19 crc kubenswrapper[4991]: E1206 01:24:19.454328 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7\": container with ID starting with 70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7 not found: ID does not exist" containerID="70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.454359 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7"} err="failed to get container status \"70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7\": rpc error: code = NotFound desc = could not find container \"70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7\": container with ID starting with 70e84139579a76c64d6888d736431bcd66bd236eb2d0d8df41fd650fb013a1e7 not 
found: ID does not exist" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.473312 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.505817 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.526615 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:24:19 crc kubenswrapper[4991]: E1206 01:24:19.529472 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-httpd" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.529503 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-httpd" Dec 06 01:24:19 crc kubenswrapper[4991]: E1206 01:24:19.529552 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-log" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.529564 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-log" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.529789 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-httpd" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.529807 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" containerName="glance-log" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.532446 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.539408 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.539718 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.549091 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.600816 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.600866 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.600907 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.600936 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.600974 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.601033 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjxd\" (UniqueName: \"kubernetes.io/projected/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-kube-api-access-2bjxd\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.601088 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.601127 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-logs\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703428 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-logs\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703551 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703627 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703666 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703696 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703733 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703767 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjxd\" (UniqueName: \"kubernetes.io/projected/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-kube-api-access-2bjxd\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.703825 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.704400 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.704661 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-logs\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.706052 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") device mount path 
\"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.714506 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.718963 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.719213 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.725137 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.729543 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjxd\" (UniqueName: \"kubernetes.io/projected/9569cf80-9b29-4a2e-8a9b-a618fb5b0128-kube-api-access-2bjxd\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 
crc kubenswrapper[4991]: E1206 01:24:19.754102 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8e1b96_e706_4f50_b033_b0fb66936deb.slice/crio-81f722b0b530fbbf80e3ab03e0964afc4bb44f27cbb6f08260eee6176b260f60\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8e1b96_e706_4f50_b033_b0fb66936deb.slice\": RecentStats: unable to find data in memory cache]" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.756232 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9569cf80-9b29-4a2e-8a9b-a618fb5b0128\") " pod="openstack/glance-default-internal-api-0" Dec 06 01:24:19 crc kubenswrapper[4991]: I1206 01:24:19.874422 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:20 crc kubenswrapper[4991]: I1206 01:24:20.616488 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 06 01:24:21 crc kubenswrapper[4991]: I1206 01:24:21.067405 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8e1b96-e706-4f50-b033-b0fb66936deb" path="/var/lib/kubelet/pods/7f8e1b96-e706-4f50-b033-b0fb66936deb/volumes" Dec 06 01:24:21 crc kubenswrapper[4991]: I1206 01:24:21.171544 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b96bae9-f846-4e7e-b1df-3d51b139dcb5","Type":"ContainerStarted","Data":"95e2eecb161bd54f3d7c4761abfd7230dee78effa8561b9ea29df53e99f307bf"} Dec 06 01:24:21 crc kubenswrapper[4991]: I1206 01:24:21.176022 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9569cf80-9b29-4a2e-8a9b-a618fb5b0128","Type":"ContainerStarted","Data":"f9f6724c2916d75ddf81f4079b8c54ce1cec2d05ae10f858b2d96b9af1278ac0"} Dec 06 01:24:21 crc kubenswrapper[4991]: I1206 01:24:21.197325 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.197299053 podStartE2EDuration="4.197299053s" podCreationTimestamp="2025-12-06 01:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:21.193359891 +0000 UTC m=+1334.543650359" watchObservedRunningTime="2025-12-06 01:24:21.197299053 +0000 UTC m=+1334.547589521" Dec 06 01:24:22 crc kubenswrapper[4991]: I1206 01:24:22.194092 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9569cf80-9b29-4a2e-8a9b-a618fb5b0128","Type":"ContainerStarted","Data":"ef8e7a21c3c29fa1c00bd20c0ee801e621ab71d6003d5cee443bf97215fd2b3e"} Dec 06 01:24:22 crc kubenswrapper[4991]: I1206 01:24:22.195045 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9569cf80-9b29-4a2e-8a9b-a618fb5b0128","Type":"ContainerStarted","Data":"2d6bcdc234a9f9ab25f83e995f1fbcd71e6a7d6df9f77556f42449a12311bab2"} Dec 06 01:24:22 crc kubenswrapper[4991]: I1206 01:24:22.197409 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerStarted","Data":"d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b"} Dec 06 01:24:22 crc kubenswrapper[4991]: I1206 01:24:22.228045 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.228023737 podStartE2EDuration="3.228023737s" podCreationTimestamp="2025-12-06 01:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:22.216725344 +0000 UTC m=+1335.567015812" watchObservedRunningTime="2025-12-06 01:24:22.228023737 +0000 UTC m=+1335.578314205" Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.216100 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerStarted","Data":"48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902"} Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.216647 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.216266 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="proxy-httpd" containerID="cri-o://48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902" gracePeriod=30 Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.216200 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-central-agent" containerID="cri-o://ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44" gracePeriod=30 Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.216342 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-notification-agent" containerID="cri-o://4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917" gracePeriod=30 Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.216330 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="sg-core" containerID="cri-o://d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b" gracePeriod=30 Dec 06 01:24:23 crc kubenswrapper[4991]: I1206 01:24:23.252062 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.567685303 podStartE2EDuration="7.252039637s" podCreationTimestamp="2025-12-06 01:24:16 +0000 UTC" firstStartedPulling="2025-12-06 01:24:17.589885308 +0000 UTC m=+1330.940175776" lastFinishedPulling="2025-12-06 01:24:22.274239642 +0000 UTC m=+1335.624530110" observedRunningTime="2025-12-06 01:24:23.247776157 +0000 UTC m=+1336.598066625" watchObservedRunningTime="2025-12-06 01:24:23.252039637 +0000 UTC m=+1336.602330105" Dec 06 01:24:24 crc kubenswrapper[4991]: I1206 01:24:24.226033 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="82992cbc-9e0f-44a8-bd20-06a746836080" containerID="48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902" exitCode=0 Dec 06 01:24:24 crc kubenswrapper[4991]: I1206 01:24:24.226371 4991 generic.go:334] "Generic (PLEG): container finished" podID="82992cbc-9e0f-44a8-bd20-06a746836080" containerID="d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b" exitCode=2 Dec 06 01:24:24 crc kubenswrapper[4991]: I1206 01:24:24.226383 4991 generic.go:334] "Generic (PLEG): container finished" podID="82992cbc-9e0f-44a8-bd20-06a746836080" containerID="4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917" exitCode=0 Dec 06 01:24:24 crc kubenswrapper[4991]: I1206 01:24:24.226103 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerDied","Data":"48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902"} Dec 06 01:24:24 crc kubenswrapper[4991]: I1206 01:24:24.226426 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerDied","Data":"d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b"} Dec 06 01:24:24 crc kubenswrapper[4991]: I1206 01:24:24.226439 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerDied","Data":"4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917"} Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.220094 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.237780 4991 generic.go:334] "Generic (PLEG): container finished" podID="82992cbc-9e0f-44a8-bd20-06a746836080" containerID="ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44" exitCode=0 Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.237859 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerDied","Data":"ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44"} Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.237907 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82992cbc-9e0f-44a8-bd20-06a746836080","Type":"ContainerDied","Data":"5089b706978a91c1112270f076d8ae7d84fb35fb1d52e8ad3c26b3e86b6ed5d1"} Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.237942 4991 scope.go:117] "RemoveContainer" containerID="48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.238268 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.275427 4991 scope.go:117] "RemoveContainer" containerID="d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.297104 4991 scope.go:117] "RemoveContainer" containerID="4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328039 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328177 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-log-httpd\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328321 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-run-httpd\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328372 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-sg-core-conf-yaml\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328440 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-config-data\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328479 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thsdb\" (UniqueName: \"kubernetes.io/projected/82992cbc-9e0f-44a8-bd20-06a746836080-kube-api-access-thsdb\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.328604 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-scripts\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.330471 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.330714 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.333330 4991 scope.go:117] "RemoveContainer" containerID="ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.338878 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82992cbc-9e0f-44a8-bd20-06a746836080-kube-api-access-thsdb" (OuterVolumeSpecName: "kube-api-access-thsdb") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "kube-api-access-thsdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.346363 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-scripts" (OuterVolumeSpecName: "scripts") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.377724 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.431622 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.431822 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle\") pod \"82992cbc-9e0f-44a8-bd20-06a746836080\" (UID: \"82992cbc-9e0f-44a8-bd20-06a746836080\") " Dec 06 01:24:25 crc kubenswrapper[4991]: W1206 01:24:25.431954 4991 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/82992cbc-9e0f-44a8-bd20-06a746836080/volumes/kubernetes.io~secret/combined-ca-bundle Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.431974 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.432762 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thsdb\" (UniqueName: \"kubernetes.io/projected/82992cbc-9e0f-44a8-bd20-06a746836080-kube-api-access-thsdb\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.432797 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.432814 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.432829 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.432846 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82992cbc-9e0f-44a8-bd20-06a746836080-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.432862 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.438032 4991 scope.go:117] "RemoveContainer" containerID="48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.438818 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902\": container with ID starting with 48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902 not found: ID does not exist" containerID="48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.438875 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902"} err="failed to get container status \"48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902\": rpc error: code = NotFound desc = could not find container \"48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902\": container with ID starting with 48266da3e02965500205c8e880eae95bc2ffc0856b7b6f12417d7160f024d902 not found: ID does not exist" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.438936 4991 scope.go:117] "RemoveContainer" containerID="d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.439356 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b\": container with ID starting with d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b not found: ID does not exist" containerID="d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.439400 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b"} err="failed to get container status \"d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b\": rpc error: code = NotFound desc = could not find container \"d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b\": 
container with ID starting with d86fd1700aada4a1df634eb44ca9d398950fd93b0704081dab1d64e22f9cb12b not found: ID does not exist" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.439437 4991 scope.go:117] "RemoveContainer" containerID="4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.439883 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917\": container with ID starting with 4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917 not found: ID does not exist" containerID="4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.439921 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917"} err="failed to get container status \"4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917\": rpc error: code = NotFound desc = could not find container \"4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917\": container with ID starting with 4843dfee6ce241989aa04e8ae7fe3e82aec010f1ab2a8c6e5f43684ff5ccc917 not found: ID does not exist" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.439940 4991 scope.go:117] "RemoveContainer" containerID="ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.440289 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44\": container with ID starting with ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44 not found: ID does not exist" 
containerID="ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.440447 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44"} err="failed to get container status \"ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44\": rpc error: code = NotFound desc = could not find container \"ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44\": container with ID starting with ab52f9f24fc3826b9bbb616c4f814f1e7b9a3691cd48b7de48df318a35696b44 not found: ID does not exist" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.457916 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-config-data" (OuterVolumeSpecName: "config-data") pod "82992cbc-9e0f-44a8-bd20-06a746836080" (UID: "82992cbc-9e0f-44a8-bd20-06a746836080"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.535188 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82992cbc-9e0f-44a8-bd20-06a746836080-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.584801 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.613683 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.629373 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.630021 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="proxy-httpd" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630046 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="proxy-httpd" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.630083 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="sg-core" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630092 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="sg-core" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 01:24:25.630128 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-notification-agent" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630138 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-notification-agent" Dec 06 01:24:25 crc kubenswrapper[4991]: E1206 
01:24:25.630145 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-central-agent" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630152 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-central-agent" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630362 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-notification-agent" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630387 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="proxy-httpd" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630407 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="sg-core" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.630418 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" containerName="ceilometer-central-agent" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.632582 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.635184 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.635307 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.642124 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.739365 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.739487 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-config-data\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.739540 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-run-httpd\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.739969 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-scripts\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " 
pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.740172 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-log-httpd\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.740200 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.740232 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvkmv\" (UniqueName: \"kubernetes.io/projected/52b047e6-8844-421a-877f-3cbda2e6d6a7-kube-api-access-dvkmv\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.842799 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-config-data\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.842878 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-run-httpd\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.842946 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-scripts\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.843009 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-log-httpd\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.843033 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.843063 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvkmv\" (UniqueName: \"kubernetes.io/projected/52b047e6-8844-421a-877f-3cbda2e6d6a7-kube-api-access-dvkmv\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.843115 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.843769 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-log-httpd\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 
crc kubenswrapper[4991]: I1206 01:24:25.844171 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-run-httpd\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.847189 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-scripts\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.847331 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-config-data\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.852270 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.853871 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.876944 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvkmv\" (UniqueName: \"kubernetes.io/projected/52b047e6-8844-421a-877f-3cbda2e6d6a7-kube-api-access-dvkmv\") pod \"ceilometer-0\" (UID: 
\"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " pod="openstack/ceilometer-0" Dec 06 01:24:25 crc kubenswrapper[4991]: I1206 01:24:25.968821 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:24:26 crc kubenswrapper[4991]: I1206 01:24:26.475343 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.051349 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82992cbc-9e0f-44a8-bd20-06a746836080" path="/var/lib/kubelet/pods/82992cbc-9e0f-44a8-bd20-06a746836080/volumes" Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.260394 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerStarted","Data":"c18cb57f467f2a5287fc106969693f91ed3079a86c8dbde8b35af87df6dc4c16"} Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.260817 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerStarted","Data":"866c04d3860e48fbb0f3c6008b430b7c3860ebafeab8b86b220512051e368c15"} Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.262891 4991 generic.go:334] "Generic (PLEG): container finished" podID="4490e9af-2abc-4d2c-83e2-ce5b7324a32a" containerID="e4520f4c284293c26259544c9abb2578872b250069cdf4ad7a9791a263cff66c" exitCode=0 Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.262963 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" event={"ID":"4490e9af-2abc-4d2c-83e2-ce5b7324a32a","Type":"ContainerDied","Data":"e4520f4c284293c26259544c9abb2578872b250069cdf4ad7a9791a263cff66c"} Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.463722 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 
06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.464250 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.504598 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 01:24:27 crc kubenswrapper[4991]: I1206 01:24:27.537816 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.277713 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerStarted","Data":"e6daab8b21fd31e0304f1079dc3cf6bf52fa4ee9e5f3808aa67af05423c74d6c"} Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.278503 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.278537 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.709106 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.816099 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-scripts\") pod \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.816273 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jngdk\" (UniqueName: \"kubernetes.io/projected/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-kube-api-access-jngdk\") pod \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.816388 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-combined-ca-bundle\") pod \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.816549 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-config-data\") pod \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\" (UID: \"4490e9af-2abc-4d2c-83e2-ce5b7324a32a\") " Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.823080 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-scripts" (OuterVolumeSpecName: "scripts") pod "4490e9af-2abc-4d2c-83e2-ce5b7324a32a" (UID: "4490e9af-2abc-4d2c-83e2-ce5b7324a32a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.830621 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-kube-api-access-jngdk" (OuterVolumeSpecName: "kube-api-access-jngdk") pod "4490e9af-2abc-4d2c-83e2-ce5b7324a32a" (UID: "4490e9af-2abc-4d2c-83e2-ce5b7324a32a"). InnerVolumeSpecName "kube-api-access-jngdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.853818 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-config-data" (OuterVolumeSpecName: "config-data") pod "4490e9af-2abc-4d2c-83e2-ce5b7324a32a" (UID: "4490e9af-2abc-4d2c-83e2-ce5b7324a32a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.870130 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4490e9af-2abc-4d2c-83e2-ce5b7324a32a" (UID: "4490e9af-2abc-4d2c-83e2-ce5b7324a32a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.918913 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jngdk\" (UniqueName: \"kubernetes.io/projected/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-kube-api-access-jngdk\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.918950 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.918966 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:28 crc kubenswrapper[4991]: I1206 01:24:28.918977 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4490e9af-2abc-4d2c-83e2-ce5b7324a32a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.289900 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" event={"ID":"4490e9af-2abc-4d2c-83e2-ce5b7324a32a","Type":"ContainerDied","Data":"306db855ef1d81c35620b9147446ff8f5d8cbbaef9a6511cb79d5ffd37b2338d"} Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.289948 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306db855ef1d81c35620b9147446ff8f5d8cbbaef9a6511cb79d5ffd37b2338d" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.289951 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ktmxx" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.293940 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerStarted","Data":"0a1cd767529c52b46a8c6ef7ac64e97205c0053589d1730827360c349e47f48b"} Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.454133 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 01:24:29 crc kubenswrapper[4991]: E1206 01:24:29.454780 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4490e9af-2abc-4d2c-83e2-ce5b7324a32a" containerName="nova-cell0-conductor-db-sync" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.454812 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4490e9af-2abc-4d2c-83e2-ce5b7324a32a" containerName="nova-cell0-conductor-db-sync" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.455106 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4490e9af-2abc-4d2c-83e2-ce5b7324a32a" containerName="nova-cell0-conductor-db-sync" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.459214 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.462182 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pdgft" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.462509 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.476951 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.531628 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/380a9e11-2703-494e-aae0-ce7758d34509-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.532066 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/380a9e11-2703-494e-aae0-ce7758d34509-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.532198 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbsrg\" (UniqueName: \"kubernetes.io/projected/380a9e11-2703-494e-aae0-ce7758d34509-kube-api-access-zbsrg\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.634146 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/380a9e11-2703-494e-aae0-ce7758d34509-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.634267 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/380a9e11-2703-494e-aae0-ce7758d34509-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.634314 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbsrg\" (UniqueName: \"kubernetes.io/projected/380a9e11-2703-494e-aae0-ce7758d34509-kube-api-access-zbsrg\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.640197 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/380a9e11-2703-494e-aae0-ce7758d34509-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.646678 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/380a9e11-2703-494e-aae0-ce7758d34509-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.664131 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbsrg\" (UniqueName: \"kubernetes.io/projected/380a9e11-2703-494e-aae0-ce7758d34509-kube-api-access-zbsrg\") pod \"nova-cell0-conductor-0\" 
(UID: \"380a9e11-2703-494e-aae0-ce7758d34509\") " pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.837364 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.879831 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.879883 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.924230 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:29 crc kubenswrapper[4991]: I1206 01:24:29.936669 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.309801 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerStarted","Data":"113dcd9de662fe5153f1615b7d15bcb1686d74ed586a7ce45a06f0af10645bcb"} Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.310850 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.310875 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.342947 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.856254209 podStartE2EDuration="5.342914821s" podCreationTimestamp="2025-12-06 01:24:25 +0000 UTC" firstStartedPulling="2025-12-06 01:24:26.475245947 +0000 
UTC m=+1339.825536425" lastFinishedPulling="2025-12-06 01:24:29.961906569 +0000 UTC m=+1343.312197037" observedRunningTime="2025-12-06 01:24:30.332222295 +0000 UTC m=+1343.682512773" watchObservedRunningTime="2025-12-06 01:24:30.342914821 +0000 UTC m=+1343.693205289" Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.394581 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.395805 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 06 01:24:30 crc kubenswrapper[4991]: I1206 01:24:30.447892 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 06 01:24:31 crc kubenswrapper[4991]: I1206 01:24:31.347855 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"380a9e11-2703-494e-aae0-ce7758d34509","Type":"ContainerStarted","Data":"6c0e82468c9d33a50d952ec1e0c2985dfdbd78f913d108d6c4df4bc159e7d174"} Dec 06 01:24:31 crc kubenswrapper[4991]: I1206 01:24:31.348527 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:24:31 crc kubenswrapper[4991]: I1206 01:24:31.348554 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"380a9e11-2703-494e-aae0-ce7758d34509","Type":"ContainerStarted","Data":"22fd7cc80a71ba4f6deebcf34dce43ce4cb764e01138e6373c28dc40987a9f27"} Dec 06 01:24:31 crc kubenswrapper[4991]: I1206 01:24:31.348913 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:31 crc kubenswrapper[4991]: I1206 01:24:31.376169 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.37614789 podStartE2EDuration="2.37614789s" 
podCreationTimestamp="2025-12-06 01:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:31.366045809 +0000 UTC m=+1344.716336287" watchObservedRunningTime="2025-12-06 01:24:31.37614789 +0000 UTC m=+1344.726438358" Dec 06 01:24:32 crc kubenswrapper[4991]: I1206 01:24:32.575448 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:32 crc kubenswrapper[4991]: I1206 01:24:32.575811 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 06 01:24:32 crc kubenswrapper[4991]: I1206 01:24:32.577080 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 06 01:24:39 crc kubenswrapper[4991]: I1206 01:24:39.887328 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.421027 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-m77vk"] Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.423141 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.425867 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.426256 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.455591 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m77vk"] Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.503520 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-scripts\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.503685 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.503738 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-config-data\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.503814 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm8nm\" (UniqueName: 
\"kubernetes.io/projected/3fc9bd44-9dec-455c-85a1-87064befa3ce-kube-api-access-fm8nm\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.606578 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-scripts\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.606702 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.606737 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-config-data\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.606809 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm8nm\" (UniqueName: \"kubernetes.io/projected/3fc9bd44-9dec-455c-85a1-87064befa3ce-kube-api-access-fm8nm\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.627218 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-config-data\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.627692 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.635625 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-scripts\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.664710 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm8nm\" (UniqueName: \"kubernetes.io/projected/3fc9bd44-9dec-455c-85a1-87064befa3ce-kube-api-access-fm8nm\") pod \"nova-cell0-cell-mapping-m77vk\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.762063 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.763204 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.764387 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.769097 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.786741 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.855672 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.857499 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.864704 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.892127 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941208 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-config-data\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941295 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pjk\" (UniqueName: \"kubernetes.io/projected/90efc906-29ab-474c-9db5-d8b8dc495afc-kube-api-access-p7pjk\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941358 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-config-data\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941385 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941405 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90efc906-29ab-474c-9db5-d8b8dc495afc-logs\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941430 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnqw\" (UniqueName: \"kubernetes.io/projected/6292e839-f3b8-4072-a6d3-40c9a88e5656-kube-api-access-dnnqw\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:40 crc kubenswrapper[4991]: I1206 01:24:40.941499 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044069 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044135 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-config-data\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044165 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pjk\" (UniqueName: \"kubernetes.io/projected/90efc906-29ab-474c-9db5-d8b8dc495afc-kube-api-access-p7pjk\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044210 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-config-data\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044233 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044250 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90efc906-29ab-474c-9db5-d8b8dc495afc-logs\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.044275 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dnnqw\" (UniqueName: \"kubernetes.io/projected/6292e839-f3b8-4072-a6d3-40c9a88e5656-kube-api-access-dnnqw\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.047526 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90efc906-29ab-474c-9db5-d8b8dc495afc-logs\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.061787 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-config-data\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.063600 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-config-data\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.071671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pjk\" (UniqueName: \"kubernetes.io/projected/90efc906-29ab-474c-9db5-d8b8dc495afc-kube-api-access-p7pjk\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.071749 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnqw\" (UniqueName: \"kubernetes.io/projected/6292e839-f3b8-4072-a6d3-40c9a88e5656-kube-api-access-dnnqw\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc 
kubenswrapper[4991]: I1206 01:24:41.071927 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.072157 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.072387 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.073532 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.079549 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.097424 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.105766 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.140711 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.142923 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.156798 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.183597 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.184849 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-config-data\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.185579 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbkp\" (UniqueName: \"kubernetes.io/projected/fdcb1d06-e919-463f-9301-596a4524e303-kube-api-access-nxbkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.185866 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.186667 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.187236 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c802574-8e08-41bc-9c96-604388fc89e1-logs\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.187344 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlch9\" (UniqueName: \"kubernetes.io/projected/8c802574-8e08-41bc-9c96-604388fc89e1-kube-api-access-qlch9\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.283122 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.302326 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.306046 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.306687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-config-data\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.306836 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbkp\" (UniqueName: \"kubernetes.io/projected/fdcb1d06-e919-463f-9301-596a4524e303-kube-api-access-nxbkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.306924 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.307021 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.317208 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c802574-8e08-41bc-9c96-604388fc89e1-logs\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.317392 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlch9\" (UniqueName: \"kubernetes.io/projected/8c802574-8e08-41bc-9c96-604388fc89e1-kube-api-access-qlch9\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.318403 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c802574-8e08-41bc-9c96-604388fc89e1-logs\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.328576 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-config-data\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.332875 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.333495 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.336262 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.358555 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbkp\" (UniqueName: \"kubernetes.io/projected/fdcb1d06-e919-463f-9301-596a4524e303-kube-api-access-nxbkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.365036 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlch9\" (UniqueName: \"kubernetes.io/projected/8c802574-8e08-41bc-9c96-604388fc89e1-kube-api-access-qlch9\") pod \"nova-metadata-0\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") " pod="openstack/nova-metadata-0" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.372770 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-7chjv"] Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.382776 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.398529 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-7chjv"] Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.419296 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-svc\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.419384 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55cj\" (UniqueName: \"kubernetes.io/projected/5151c0e3-7834-4643-8b66-5d7e0f421059-kube-api-access-j55cj\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.419434 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.419471 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.419520 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-config\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.419579 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.424819 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.512270 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.521695 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-svc\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.521764 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55cj\" (UniqueName: \"kubernetes.io/projected/5151c0e3-7834-4643-8b66-5d7e0f421059-kube-api-access-j55cj\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.521801 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.521845 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.521884 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-config\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.521931 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.522849 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.523447 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-svc\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.524998 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.525836 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.526754 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-config\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.545665 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55cj\" (UniqueName: \"kubernetes.io/projected/5151c0e3-7834-4643-8b66-5d7e0f421059-kube-api-access-j55cj\") pod \"dnsmasq-dns-865f5d856f-7chjv\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.700222 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m77vk"]
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.710037 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.849348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 06 01:24:41 crc kubenswrapper[4991]: W1206 01:24:41.894522 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90efc906_29ab_474c_9db5_d8b8dc495afc.slice/crio-e12b51fe2f552287f96040acd57b079e6c506353a4c71a2c35ba3619f85b7bed WatchSource:0}: Error finding container e12b51fe2f552287f96040acd57b079e6c506353a4c71a2c35ba3619f85b7bed: Status 404 returned error can't find the container with id e12b51fe2f552287f96040acd57b079e6c506353a4c71a2c35ba3619f85b7bed
Dec 06 01:24:41 crc kubenswrapper[4991]: W1206 01:24:41.968870 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdcb1d06_e919_463f_9301_596a4524e303.slice/crio-195954df317a2d0f73412817d491fd27048e7823002d7abe1db5389b65a323eb WatchSource:0}: Error finding container 195954df317a2d0f73412817d491fd27048e7823002d7abe1db5389b65a323eb: Status 404 returned error can't find the container with id 195954df317a2d0f73412817d491fd27048e7823002d7abe1db5389b65a323eb
Dec 06 01:24:41 crc kubenswrapper[4991]: I1206 01:24:41.969003 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.021682 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.047490 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jtnk5"]
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.048945 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.065187 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.067262 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.086223 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jtnk5"]
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.138197 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcttr\" (UniqueName: \"kubernetes.io/projected/828f26ed-d1bf-4625-a643-a72c9207b0a5-kube-api-access-dcttr\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.138284 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-scripts\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.138380 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-config-data\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.138398 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.240082 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcttr\" (UniqueName: \"kubernetes.io/projected/828f26ed-d1bf-4625-a643-a72c9207b0a5-kube-api-access-dcttr\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.240631 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-scripts\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.240781 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-config-data\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.240839 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.248968 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-scripts\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.249550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.250496 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-config-data\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.263818 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcttr\" (UniqueName: \"kubernetes.io/projected/828f26ed-d1bf-4625-a643-a72c9207b0a5-kube-api-access-dcttr\") pod \"nova-cell1-conductor-db-sync-jtnk5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.289911 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.379781 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-7chjv"]
Dec 06 01:24:42 crc kubenswrapper[4991]: W1206 01:24:42.381557 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5151c0e3_7834_4643_8b66_5d7e0f421059.slice/crio-db10b62306dda0dbc50bc1b37602472f24d84e58314b2098adf3910d116f42c3 WatchSource:0}: Error finding container db10b62306dda0dbc50bc1b37602472f24d84e58314b2098adf3910d116f42c3: Status 404 returned error can't find the container with id db10b62306dda0dbc50bc1b37602472f24d84e58314b2098adf3910d116f42c3
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.384105 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jtnk5"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.482046 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90efc906-29ab-474c-9db5-d8b8dc495afc","Type":"ContainerStarted","Data":"e12b51fe2f552287f96040acd57b079e6c506353a4c71a2c35ba3619f85b7bed"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.496640 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m77vk" event={"ID":"3fc9bd44-9dec-455c-85a1-87064befa3ce","Type":"ContainerStarted","Data":"d1285efbb0688e01733671f5dbc93c616f9c7da3c7945ee4ef98b0d0c37a60b0"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.496683 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m77vk" event={"ID":"3fc9bd44-9dec-455c-85a1-87064befa3ce","Type":"ContainerStarted","Data":"1ec9a2b4241997dd96edb1346b5c4c958c5af71d4457e53e3ccd13e4c01e6eca"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.499639 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6292e839-f3b8-4072-a6d3-40c9a88e5656","Type":"ContainerStarted","Data":"3c56ddfe976209d2aaec40aca1dc01f0105d2409b9d3cdfe617fc43d99c2dd63"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.501772 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdcb1d06-e919-463f-9301-596a4524e303","Type":"ContainerStarted","Data":"195954df317a2d0f73412817d491fd27048e7823002d7abe1db5389b65a323eb"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.514562 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-m77vk" podStartSLOduration=2.51454508 podStartE2EDuration="2.51454508s" podCreationTimestamp="2025-12-06 01:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:42.512318162 +0000 UTC m=+1355.862608630" watchObservedRunningTime="2025-12-06 01:24:42.51454508 +0000 UTC m=+1355.864835548"
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.515175 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c802574-8e08-41bc-9c96-604388fc89e1","Type":"ContainerStarted","Data":"7a34dffefba7c02bedd143a7f1a307a0ae85aad8dcd97cc829967860c96091dd"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.517815 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" event={"ID":"5151c0e3-7834-4643-8b66-5d7e0f421059","Type":"ContainerStarted","Data":"db10b62306dda0dbc50bc1b37602472f24d84e58314b2098adf3910d116f42c3"}
Dec 06 01:24:42 crc kubenswrapper[4991]: I1206 01:24:42.913561 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jtnk5"]
Dec 06 01:24:43 crc kubenswrapper[4991]: I1206 01:24:43.526908 4991 generic.go:334] "Generic (PLEG): container finished" podID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerID="24d10f3b68efc3ee2a148dec96724d60f42d9179242e037284a48e65db97e90b" exitCode=0
Dec 06 01:24:43 crc kubenswrapper[4991]: I1206 01:24:43.527345 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" event={"ID":"5151c0e3-7834-4643-8b66-5d7e0f421059","Type":"ContainerDied","Data":"24d10f3b68efc3ee2a148dec96724d60f42d9179242e037284a48e65db97e90b"}
Dec 06 01:24:43 crc kubenswrapper[4991]: I1206 01:24:43.534085 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" event={"ID":"828f26ed-d1bf-4625-a643-a72c9207b0a5","Type":"ContainerStarted","Data":"8e1b13a8c747a4ec45a52166136620ce2aa7810514a9fcfaae96d18504b018cb"}
Dec 06 01:24:43 crc kubenswrapper[4991]: I1206 01:24:43.534117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" event={"ID":"828f26ed-d1bf-4625-a643-a72c9207b0a5","Type":"ContainerStarted","Data":"2fa29b13fac801ce4ca687b8f8a2a9b0edeeac246e5a722c6e7a70e36e099b06"}
Dec 06 01:24:43 crc kubenswrapper[4991]: I1206 01:24:43.575159 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" podStartSLOduration=1.575136486 podStartE2EDuration="1.575136486s" podCreationTimestamp="2025-12-06 01:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:43.573004271 +0000 UTC m=+1356.923294739" watchObservedRunningTime="2025-12-06 01:24:43.575136486 +0000 UTC m=+1356.925426954"
Dec 06 01:24:44 crc kubenswrapper[4991]: I1206 01:24:44.668341 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 06 01:24:44 crc kubenswrapper[4991]: I1206 01:24:44.678175 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.576275 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c802574-8e08-41bc-9c96-604388fc89e1","Type":"ContainerStarted","Data":"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"}
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.593263 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" event={"ID":"5151c0e3-7834-4643-8b66-5d7e0f421059","Type":"ContainerStarted","Data":"f6c7eed506f4e8b6ba442e15c7c5bf52ecb32f2b70b0da1ed5c1aa3475a58e25"}
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.594525 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-7chjv"
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.619233 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90efc906-29ab-474c-9db5-d8b8dc495afc","Type":"ContainerStarted","Data":"b428dcd737da574263318d54829b55013b5e0339d4a634ad63d8a039c444a457"}
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.619283 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90efc906-29ab-474c-9db5-d8b8dc495afc","Type":"ContainerStarted","Data":"70f7b5404b4c3cbed5ff8f35410c5c9ca47d9a96aa6520b0bf11c0d8c65c9b2f"}
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.645313 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6292e839-f3b8-4072-a6d3-40c9a88e5656","Type":"ContainerStarted","Data":"b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342"}
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.673237 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdcb1d06-e919-463f-9301-596a4524e303","Type":"ContainerStarted","Data":"e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a"}
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.673468 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fdcb1d06-e919-463f-9301-596a4524e303" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a" gracePeriod=30
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.716065 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.663488181 podStartE2EDuration="6.716046438s" podCreationTimestamp="2025-12-06 01:24:40 +0000 UTC" firstStartedPulling="2025-12-06 01:24:41.907475562 +0000 UTC m=+1355.257766030" lastFinishedPulling="2025-12-06 01:24:45.960033819 +0000 UTC m=+1359.310324287" observedRunningTime="2025-12-06 01:24:46.686272598 +0000 UTC m=+1360.036563066" watchObservedRunningTime="2025-12-06 01:24:46.716046438 +0000 UTC m=+1360.066336906"
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.716185 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" podStartSLOduration=5.716180691 podStartE2EDuration="5.716180691s" podCreationTimestamp="2025-12-06 01:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:46.636561772 +0000 UTC m=+1359.986852240" watchObservedRunningTime="2025-12-06 01:24:46.716180691 +0000 UTC m=+1360.066471159"
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.726258 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.793086033 podStartE2EDuration="6.726237801s" podCreationTimestamp="2025-12-06 01:24:40 +0000 UTC" firstStartedPulling="2025-12-06 01:24:42.026552202 +0000 UTC m=+1355.376842670" lastFinishedPulling="2025-12-06 01:24:45.95970397 +0000 UTC m=+1359.309994438" observedRunningTime="2025-12-06 01:24:46.715506144 +0000 UTC m=+1360.065796612" watchObservedRunningTime="2025-12-06 01:24:46.726237801 +0000 UTC m=+1360.076528269"
Dec 06 01:24:46 crc kubenswrapper[4991]: I1206 01:24:46.787562 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8051902659999999 podStartE2EDuration="5.787528226s" podCreationTimestamp="2025-12-06 01:24:41 +0000 UTC" firstStartedPulling="2025-12-06 01:24:41.977299708 +0000 UTC m=+1355.327590176" lastFinishedPulling="2025-12-06 01:24:45.959637668 +0000 UTC m=+1359.309928136" observedRunningTime="2025-12-06 01:24:46.738861628 +0000 UTC m=+1360.089152096" watchObservedRunningTime="2025-12-06 01:24:46.787528226 +0000 UTC m=+1360.137818684"
Dec 06 01:24:47 crc kubenswrapper[4991]: I1206 01:24:47.696327 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-log" containerID="cri-o://b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104" gracePeriod=30
Dec 06 01:24:47 crc kubenswrapper[4991]: I1206 01:24:47.697202 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c802574-8e08-41bc-9c96-604388fc89e1","Type":"ContainerStarted","Data":"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"}
Dec 06 01:24:47 crc kubenswrapper[4991]: I1206 01:24:47.697267 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-metadata" containerID="cri-o://2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf" gracePeriod=30
Dec 06 01:24:47 crc kubenswrapper[4991]: I1206 01:24:47.754787 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.086524511 podStartE2EDuration="6.754768329s" podCreationTimestamp="2025-12-06 01:24:41 +0000 UTC" firstStartedPulling="2025-12-06 01:24:42.298908385 +0000 UTC m=+1355.649198853" lastFinishedPulling="2025-12-06 01:24:45.967152193 +0000 UTC m=+1359.317442671" observedRunningTime="2025-12-06 01:24:47.745326267 +0000 UTC m=+1361.095616735" watchObservedRunningTime="2025-12-06 01:24:47.754768329 +0000 UTC m=+1361.105058797"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.643030 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.681865 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c802574-8e08-41bc-9c96-604388fc89e1-logs\") pod \"8c802574-8e08-41bc-9c96-604388fc89e1\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") "
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.682535 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-combined-ca-bundle\") pod \"8c802574-8e08-41bc-9c96-604388fc89e1\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") "
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.682588 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlch9\" (UniqueName: \"kubernetes.io/projected/8c802574-8e08-41bc-9c96-604388fc89e1-kube-api-access-qlch9\") pod \"8c802574-8e08-41bc-9c96-604388fc89e1\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") "
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.682892 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-config-data\") pod \"8c802574-8e08-41bc-9c96-604388fc89e1\" (UID: \"8c802574-8e08-41bc-9c96-604388fc89e1\") "
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.684011 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c802574-8e08-41bc-9c96-604388fc89e1-logs" (OuterVolumeSpecName: "logs") pod "8c802574-8e08-41bc-9c96-604388fc89e1" (UID: "8c802574-8e08-41bc-9c96-604388fc89e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.705677 4991 generic.go:334] "Generic (PLEG): container finished" podID="8c802574-8e08-41bc-9c96-604388fc89e1" containerID="2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf" exitCode=0
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.705722 4991 generic.go:334] "Generic (PLEG): container finished" podID="8c802574-8e08-41bc-9c96-604388fc89e1" containerID="b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104" exitCode=143
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.707304 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.708033 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c802574-8e08-41bc-9c96-604388fc89e1","Type":"ContainerDied","Data":"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"}
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.708122 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c802574-8e08-41bc-9c96-604388fc89e1","Type":"ContainerDied","Data":"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"}
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.708137 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c802574-8e08-41bc-9c96-604388fc89e1","Type":"ContainerDied","Data":"7a34dffefba7c02bedd143a7f1a307a0ae85aad8dcd97cc829967860c96091dd"}
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.708157 4991 scope.go:117] "RemoveContainer" containerID="2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.725256 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-config-data" (OuterVolumeSpecName: "config-data") pod "8c802574-8e08-41bc-9c96-604388fc89e1" (UID: "8c802574-8e08-41bc-9c96-604388fc89e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.737032 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c802574-8e08-41bc-9c96-604388fc89e1-kube-api-access-qlch9" (OuterVolumeSpecName: "kube-api-access-qlch9") pod "8c802574-8e08-41bc-9c96-604388fc89e1" (UID: "8c802574-8e08-41bc-9c96-604388fc89e1"). InnerVolumeSpecName "kube-api-access-qlch9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.758547 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c802574-8e08-41bc-9c96-604388fc89e1" (UID: "8c802574-8e08-41bc-9c96-604388fc89e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.785478 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c802574-8e08-41bc-9c96-604388fc89e1-logs\") on node \"crc\" DevicePath \"\""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.785525 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.785540 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlch9\" (UniqueName: \"kubernetes.io/projected/8c802574-8e08-41bc-9c96-604388fc89e1-kube-api-access-qlch9\") on node \"crc\" DevicePath \"\""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.785557 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c802574-8e08-41bc-9c96-604388fc89e1-config-data\") on node \"crc\" DevicePath \"\""
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.796230 4991 scope.go:117] "RemoveContainer" containerID="b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.816937 4991 scope.go:117] "RemoveContainer" containerID="2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"
Dec 06 01:24:48 crc kubenswrapper[4991]: E1206 01:24:48.818032 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf\": container with ID starting with 2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf not found: ID does not exist" containerID="2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.818073 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"} err="failed to get container status \"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf\": rpc error: code = NotFound desc = could not find container \"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf\": container with ID starting with 2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf not found: ID does not exist"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.818099 4991 scope.go:117] "RemoveContainer" containerID="b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"
Dec 06 01:24:48 crc kubenswrapper[4991]: E1206 01:24:48.818483 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104\": container with ID starting with b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104 not found: ID does not exist" containerID="b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.818559 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"} err="failed to get container status \"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104\": rpc error: code = NotFound desc = could not find container \"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104\": container with ID starting with b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104 not found: ID does not exist"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.818603 4991 scope.go:117] "RemoveContainer" containerID="2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.819121 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf"} err="failed to get container status \"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf\": rpc error: code = NotFound desc = could not find container \"2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf\": container with ID starting with 2269571fbc2252f4fe39804777bda0bd4830fbab48b09de3a6275e0fa98654bf not found: ID does not exist"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.819152 4991 scope.go:117] "RemoveContainer" containerID="b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"
Dec 06 01:24:48 crc kubenswrapper[4991]: I1206 01:24:48.819552 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104"} err="failed to get container status \"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104\": rpc error: code = NotFound desc = could not find container \"b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104\": container with ID starting with b65d741d7bcaab9e03365a6cdb13aac1b2ab5aeb55cdbaf3ff68d00de4942104 not found: ID does not exist"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.052129 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.063879 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.082194 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 01:24:49 crc kubenswrapper[4991]: E1206 01:24:49.082664 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-metadata"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.082684 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-metadata"
Dec 06 01:24:49 crc kubenswrapper[4991]: E1206 01:24:49.082719 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-log"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.082725 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-log"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.082928 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-log"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.082943 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" containerName="nova-metadata-metadata"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.084038 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.086801 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.088244 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.119476 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.193093 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e950beb5-e2d8-410a-aadd-78a4a6b94329-logs\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.193218 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.193505 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-config-data\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0"
Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.193584 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8v9v\" (UniqueName: \"kubernetes.io/projected/e950beb5-e2d8-410a-aadd-78a4a6b94329-kube-api-access-c8v9v\") pod \"nova-metadata-0\" (UID:
\"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.193662 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.295569 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e950beb5-e2d8-410a-aadd-78a4a6b94329-logs\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.295661 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.295750 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-config-data\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.295773 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8v9v\" (UniqueName: \"kubernetes.io/projected/e950beb5-e2d8-410a-aadd-78a4a6b94329-kube-api-access-c8v9v\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.295804 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.296067 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e950beb5-e2d8-410a-aadd-78a4a6b94329-logs\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.300368 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-config-data\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.300399 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.300563 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.315076 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8v9v\" (UniqueName: \"kubernetes.io/projected/e950beb5-e2d8-410a-aadd-78a4a6b94329-kube-api-access-c8v9v\") pod 
\"nova-metadata-0\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.420554 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:24:49 crc kubenswrapper[4991]: I1206 01:24:49.985036 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:49 crc kubenswrapper[4991]: W1206 01:24:49.993761 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode950beb5_e2d8_410a_aadd_78a4a6b94329.slice/crio-3c63e5b02750da414642092e87397d07e966a4b184113482aab0ae419214f7e8 WatchSource:0}: Error finding container 3c63e5b02750da414642092e87397d07e966a4b184113482aab0ae419214f7e8: Status 404 returned error can't find the container with id 3c63e5b02750da414642092e87397d07e966a4b184113482aab0ae419214f7e8 Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.763908 4991 generic.go:334] "Generic (PLEG): container finished" podID="828f26ed-d1bf-4625-a643-a72c9207b0a5" containerID="8e1b13a8c747a4ec45a52166136620ce2aa7810514a9fcfaae96d18504b018cb" exitCode=0 Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.764085 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" event={"ID":"828f26ed-d1bf-4625-a643-a72c9207b0a5","Type":"ContainerDied","Data":"8e1b13a8c747a4ec45a52166136620ce2aa7810514a9fcfaae96d18504b018cb"} Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.766877 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e950beb5-e2d8-410a-aadd-78a4a6b94329","Type":"ContainerStarted","Data":"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a"} Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.766924 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e950beb5-e2d8-410a-aadd-78a4a6b94329","Type":"ContainerStarted","Data":"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5"} Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.766945 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e950beb5-e2d8-410a-aadd-78a4a6b94329","Type":"ContainerStarted","Data":"3c63e5b02750da414642092e87397d07e966a4b184113482aab0ae419214f7e8"} Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.768671 4991 generic.go:334] "Generic (PLEG): container finished" podID="3fc9bd44-9dec-455c-85a1-87064befa3ce" containerID="d1285efbb0688e01733671f5dbc93c616f9c7da3c7945ee4ef98b0d0c37a60b0" exitCode=0 Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.768711 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m77vk" event={"ID":"3fc9bd44-9dec-455c-85a1-87064befa3ce","Type":"ContainerDied","Data":"d1285efbb0688e01733671f5dbc93c616f9c7da3c7945ee4ef98b0d0c37a60b0"} Dec 06 01:24:50 crc kubenswrapper[4991]: I1206 01:24:50.834146 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.83412953 podStartE2EDuration="1.83412953s" podCreationTimestamp="2025-12-06 01:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:50.831553784 +0000 UTC m=+1364.181844252" watchObservedRunningTime="2025-12-06 01:24:50.83412953 +0000 UTC m=+1364.184419998" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.055941 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c802574-8e08-41bc-9c96-604388fc89e1" path="/var/lib/kubelet/pods/8c802574-8e08-41bc-9c96-604388fc89e1/volumes" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.107285 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 
01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.107686 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.304088 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.305146 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.342742 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.425753 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.712416 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.832938 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jsqrg"] Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.833173 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerName="dnsmasq-dns" containerID="cri-o://788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3" gracePeriod=10 Dec 06 01:24:51 crc kubenswrapper[4991]: I1206 01:24:51.868267 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.194235 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.195182 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.395092 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.409808 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.424019 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-scripts\") pod \"3fc9bd44-9dec-455c-85a1-87064befa3ce\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.424171 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm8nm\" (UniqueName: \"kubernetes.io/projected/3fc9bd44-9dec-455c-85a1-87064befa3ce-kube-api-access-fm8nm\") pod \"3fc9bd44-9dec-455c-85a1-87064befa3ce\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.424317 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-config-data\") pod \"3fc9bd44-9dec-455c-85a1-87064befa3ce\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.424353 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-combined-ca-bundle\") pod \"3fc9bd44-9dec-455c-85a1-87064befa3ce\" (UID: \"3fc9bd44-9dec-455c-85a1-87064befa3ce\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.429889 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.430119 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-scripts" (OuterVolumeSpecName: "scripts") pod "3fc9bd44-9dec-455c-85a1-87064befa3ce" (UID: "3fc9bd44-9dec-455c-85a1-87064befa3ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.458502 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fc9bd44-9dec-455c-85a1-87064befa3ce" (UID: "3fc9bd44-9dec-455c-85a1-87064befa3ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.481932 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc9bd44-9dec-455c-85a1-87064befa3ce-kube-api-access-fm8nm" (OuterVolumeSpecName: "kube-api-access-fm8nm") pod "3fc9bd44-9dec-455c-85a1-87064befa3ce" (UID: "3fc9bd44-9dec-455c-85a1-87064befa3ce"). InnerVolumeSpecName "kube-api-access-fm8nm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.481968 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-config-data" (OuterVolumeSpecName: "config-data") pod "3fc9bd44-9dec-455c-85a1-87064befa3ce" (UID: "3fc9bd44-9dec-455c-85a1-87064befa3ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.525605 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-combined-ca-bundle\") pod \"828f26ed-d1bf-4625-a643-a72c9207b0a5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.525849 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-config\") pod \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.525874 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-swift-storage-0\") pod \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.525951 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-sb\") pod \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526060 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ldpfw\" (UniqueName: \"kubernetes.io/projected/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-kube-api-access-ldpfw\") pod \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526090 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-nb\") pod \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526104 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-scripts\") pod \"828f26ed-d1bf-4625-a643-a72c9207b0a5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526142 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-config-data\") pod \"828f26ed-d1bf-4625-a643-a72c9207b0a5\" (UID: \"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526158 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-svc\") pod \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\" (UID: \"2631c52c-c2fb-40cb-beb6-7c5035dff7fe\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526193 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcttr\" (UniqueName: \"kubernetes.io/projected/828f26ed-d1bf-4625-a643-a72c9207b0a5-kube-api-access-dcttr\") pod \"828f26ed-d1bf-4625-a643-a72c9207b0a5\" (UID: 
\"828f26ed-d1bf-4625-a643-a72c9207b0a5\") " Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526562 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526580 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526590 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fc9bd44-9dec-455c-85a1-87064befa3ce-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.526599 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm8nm\" (UniqueName: \"kubernetes.io/projected/3fc9bd44-9dec-455c-85a1-87064befa3ce-kube-api-access-fm8nm\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.530475 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-scripts" (OuterVolumeSpecName: "scripts") pod "828f26ed-d1bf-4625-a643-a72c9207b0a5" (UID: "828f26ed-d1bf-4625-a643-a72c9207b0a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.530811 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828f26ed-d1bf-4625-a643-a72c9207b0a5-kube-api-access-dcttr" (OuterVolumeSpecName: "kube-api-access-dcttr") pod "828f26ed-d1bf-4625-a643-a72c9207b0a5" (UID: "828f26ed-d1bf-4625-a643-a72c9207b0a5"). InnerVolumeSpecName "kube-api-access-dcttr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.533449 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-kube-api-access-ldpfw" (OuterVolumeSpecName: "kube-api-access-ldpfw") pod "2631c52c-c2fb-40cb-beb6-7c5035dff7fe" (UID: "2631c52c-c2fb-40cb-beb6-7c5035dff7fe"). InnerVolumeSpecName "kube-api-access-ldpfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.557782 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "828f26ed-d1bf-4625-a643-a72c9207b0a5" (UID: "828f26ed-d1bf-4625-a643-a72c9207b0a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.566438 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-config-data" (OuterVolumeSpecName: "config-data") pod "828f26ed-d1bf-4625-a643-a72c9207b0a5" (UID: "828f26ed-d1bf-4625-a643-a72c9207b0a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.579611 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-config" (OuterVolumeSpecName: "config") pod "2631c52c-c2fb-40cb-beb6-7c5035dff7fe" (UID: "2631c52c-c2fb-40cb-beb6-7c5035dff7fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.582557 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2631c52c-c2fb-40cb-beb6-7c5035dff7fe" (UID: "2631c52c-c2fb-40cb-beb6-7c5035dff7fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.584294 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2631c52c-c2fb-40cb-beb6-7c5035dff7fe" (UID: "2631c52c-c2fb-40cb-beb6-7c5035dff7fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.591150 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2631c52c-c2fb-40cb-beb6-7c5035dff7fe" (UID: "2631c52c-c2fb-40cb-beb6-7c5035dff7fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.598486 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2631c52c-c2fb-40cb-beb6-7c5035dff7fe" (UID: "2631c52c-c2fb-40cb-beb6-7c5035dff7fe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628276 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628322 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628332 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldpfw\" (UniqueName: \"kubernetes.io/projected/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-kube-api-access-ldpfw\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628347 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628355 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628363 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628372 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628382 4991 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcttr\" (UniqueName: \"kubernetes.io/projected/828f26ed-d1bf-4625-a643-a72c9207b0a5-kube-api-access-dcttr\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628391 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f26ed-d1bf-4625-a643-a72c9207b0a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.628401 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2631c52c-c2fb-40cb-beb6-7c5035dff7fe-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.796910 4991 generic.go:334] "Generic (PLEG): container finished" podID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerID="788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3" exitCode=0 Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.797008 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.797030 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" event={"ID":"2631c52c-c2fb-40cb-beb6-7c5035dff7fe","Type":"ContainerDied","Data":"788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3"} Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.797082 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-jsqrg" event={"ID":"2631c52c-c2fb-40cb-beb6-7c5035dff7fe","Type":"ContainerDied","Data":"13461b804c8122865d243420779bcdf08ec4459a42d35603d133dc7a22609f7f"} Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.797101 4991 scope.go:117] "RemoveContainer" containerID="788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.800172 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" event={"ID":"828f26ed-d1bf-4625-a643-a72c9207b0a5","Type":"ContainerDied","Data":"2fa29b13fac801ce4ca687b8f8a2a9b0edeeac246e5a722c6e7a70e36e099b06"} Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.800222 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa29b13fac801ce4ca687b8f8a2a9b0edeeac246e5a722c6e7a70e36e099b06" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.800229 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jtnk5" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.802697 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m77vk" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.802693 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m77vk" event={"ID":"3fc9bd44-9dec-455c-85a1-87064befa3ce","Type":"ContainerDied","Data":"1ec9a2b4241997dd96edb1346b5c4c958c5af71d4457e53e3ccd13e4c01e6eca"} Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.802821 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ec9a2b4241997dd96edb1346b5c4c958c5af71d4457e53e3ccd13e4c01e6eca" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.875436 4991 scope.go:117] "RemoveContainer" containerID="2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.944183 4991 scope.go:117] "RemoveContainer" containerID="788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3" Dec 06 01:24:52 crc kubenswrapper[4991]: E1206 01:24:52.956926 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3\": container with ID starting with 788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3 not found: ID does not exist" containerID="788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.956997 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3"} err="failed to get container status \"788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3\": rpc error: code = NotFound desc = could not find container \"788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3\": container with ID starting with 788e98ca062f00a9c72b1297d91f301a643682b89f7a298d0f8207f1f68b3ce3 not found: 
ID does not exist" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.957027 4991 scope.go:117] "RemoveContainer" containerID="2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.957689 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jsqrg"] Dec 06 01:24:52 crc kubenswrapper[4991]: E1206 01:24:52.960162 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815\": container with ID starting with 2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815 not found: ID does not exist" containerID="2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.960196 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815"} err="failed to get container status \"2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815\": rpc error: code = NotFound desc = could not find container \"2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815\": container with ID starting with 2cac24e9a6fc037ae65da4496869291205b3e01af25d3526d9cb35dc77626815 not found: ID does not exist" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.977196 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-jsqrg"] Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.989106 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 01:24:52 crc kubenswrapper[4991]: E1206 01:24:52.989821 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerName="init" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.989861 
4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerName="init" Dec 06 01:24:52 crc kubenswrapper[4991]: E1206 01:24:52.989885 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerName="dnsmasq-dns" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.989893 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerName="dnsmasq-dns" Dec 06 01:24:52 crc kubenswrapper[4991]: E1206 01:24:52.989910 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828f26ed-d1bf-4625-a643-a72c9207b0a5" containerName="nova-cell1-conductor-db-sync" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.989916 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="828f26ed-d1bf-4625-a643-a72c9207b0a5" containerName="nova-cell1-conductor-db-sync" Dec 06 01:24:52 crc kubenswrapper[4991]: E1206 01:24:52.989938 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc9bd44-9dec-455c-85a1-87064befa3ce" containerName="nova-manage" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.989945 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc9bd44-9dec-455c-85a1-87064befa3ce" containerName="nova-manage" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.990157 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc9bd44-9dec-455c-85a1-87064befa3ce" containerName="nova-manage" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.990183 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="828f26ed-d1bf-4625-a643-a72c9207b0a5" containerName="nova-cell1-conductor-db-sync" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.990200 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" containerName="dnsmasq-dns" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.991181 
4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:52 crc kubenswrapper[4991]: I1206 01:24:52.993865 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.009637 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.063849 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwsw\" (UniqueName: \"kubernetes.io/projected/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-kube-api-access-7rwsw\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.064123 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.064241 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.065757 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2631c52c-c2fb-40cb-beb6-7c5035dff7fe" path="/var/lib/kubelet/pods/2631c52c-c2fb-40cb-beb6-7c5035dff7fe/volumes" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.104966 4991 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.105270 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-log" containerID="cri-o://70f7b5404b4c3cbed5ff8f35410c5c9ca47d9a96aa6520b0bf11c0d8c65c9b2f" gracePeriod=30 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.105751 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-api" containerID="cri-o://b428dcd737da574263318d54829b55013b5e0339d4a634ad63d8a039c444a457" gracePeriod=30 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.136288 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.146478 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.146733 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-log" containerID="cri-o://a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5" gracePeriod=30 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.147153 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-metadata" containerID="cri-o://fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a" gracePeriod=30 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.166236 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwsw\" (UniqueName: 
\"kubernetes.io/projected/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-kube-api-access-7rwsw\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.166295 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.166322 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.173109 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.173377 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.182456 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwsw\" (UniqueName: \"kubernetes.io/projected/8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd-kube-api-access-7rwsw\") pod \"nova-cell1-conductor-0\" (UID: 
\"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd\") " pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.323861 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.695751 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.782134 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-config-data\") pod \"e950beb5-e2d8-410a-aadd-78a4a6b94329\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.782278 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-combined-ca-bundle\") pod \"e950beb5-e2d8-410a-aadd-78a4a6b94329\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.782311 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e950beb5-e2d8-410a-aadd-78a4a6b94329-logs\") pod \"e950beb5-e2d8-410a-aadd-78a4a6b94329\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.782335 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-nova-metadata-tls-certs\") pod \"e950beb5-e2d8-410a-aadd-78a4a6b94329\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.782373 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-c8v9v\" (UniqueName: \"kubernetes.io/projected/e950beb5-e2d8-410a-aadd-78a4a6b94329-kube-api-access-c8v9v\") pod \"e950beb5-e2d8-410a-aadd-78a4a6b94329\" (UID: \"e950beb5-e2d8-410a-aadd-78a4a6b94329\") " Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.783182 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e950beb5-e2d8-410a-aadd-78a4a6b94329-logs" (OuterVolumeSpecName: "logs") pod "e950beb5-e2d8-410a-aadd-78a4a6b94329" (UID: "e950beb5-e2d8-410a-aadd-78a4a6b94329"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.789173 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e950beb5-e2d8-410a-aadd-78a4a6b94329-kube-api-access-c8v9v" (OuterVolumeSpecName: "kube-api-access-c8v9v") pod "e950beb5-e2d8-410a-aadd-78a4a6b94329" (UID: "e950beb5-e2d8-410a-aadd-78a4a6b94329"). InnerVolumeSpecName "kube-api-access-c8v9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.817379 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e950beb5-e2d8-410a-aadd-78a4a6b94329" (UID: "e950beb5-e2d8-410a-aadd-78a4a6b94329"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.819767 4991 generic.go:334] "Generic (PLEG): container finished" podID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerID="70f7b5404b4c3cbed5ff8f35410c5c9ca47d9a96aa6520b0bf11c0d8c65c9b2f" exitCode=143 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.819849 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90efc906-29ab-474c-9db5-d8b8dc495afc","Type":"ContainerDied","Data":"70f7b5404b4c3cbed5ff8f35410c5c9ca47d9a96aa6520b0bf11c0d8c65c9b2f"} Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.825551 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-config-data" (OuterVolumeSpecName: "config-data") pod "e950beb5-e2d8-410a-aadd-78a4a6b94329" (UID: "e950beb5-e2d8-410a-aadd-78a4a6b94329"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831171 4991 generic.go:334] "Generic (PLEG): container finished" podID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerID="fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a" exitCode=0 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831353 4991 generic.go:334] "Generic (PLEG): container finished" podID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerID="a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5" exitCode=143 Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831534 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831348 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e950beb5-e2d8-410a-aadd-78a4a6b94329","Type":"ContainerDied","Data":"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a"} Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831722 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e950beb5-e2d8-410a-aadd-78a4a6b94329","Type":"ContainerDied","Data":"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5"} Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831834 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e950beb5-e2d8-410a-aadd-78a4a6b94329","Type":"ContainerDied","Data":"3c63e5b02750da414642092e87397d07e966a4b184113482aab0ae419214f7e8"} Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.831911 4991 scope.go:117] "RemoveContainer" containerID="fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.849870 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e950beb5-e2d8-410a-aadd-78a4a6b94329" (UID: "e950beb5-e2d8-410a-aadd-78a4a6b94329"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.859064 4991 scope.go:117] "RemoveContainer" containerID="a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.874484 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.887530 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.887571 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.887587 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e950beb5-e2d8-410a-aadd-78a4a6b94329-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.887601 4991 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e950beb5-e2d8-410a-aadd-78a4a6b94329-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.887615 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8v9v\" (UniqueName: \"kubernetes.io/projected/e950beb5-e2d8-410a-aadd-78a4a6b94329-kube-api-access-c8v9v\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.894373 4991 scope.go:117] "RemoveContainer" containerID="fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a" Dec 06 01:24:53 crc kubenswrapper[4991]: E1206 01:24:53.894962 4991 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a\": container with ID starting with fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a not found: ID does not exist" containerID="fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.895005 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a"} err="failed to get container status \"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a\": rpc error: code = NotFound desc = could not find container \"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a\": container with ID starting with fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a not found: ID does not exist" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.895027 4991 scope.go:117] "RemoveContainer" containerID="a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5" Dec 06 01:24:53 crc kubenswrapper[4991]: E1206 01:24:53.895791 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5\": container with ID starting with a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5 not found: ID does not exist" containerID="a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.895915 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5"} err="failed to get container status \"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5\": rpc error: code = NotFound desc = could 
not find container \"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5\": container with ID starting with a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5 not found: ID does not exist" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.896044 4991 scope.go:117] "RemoveContainer" containerID="fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.896653 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a"} err="failed to get container status \"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a\": rpc error: code = NotFound desc = could not find container \"fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a\": container with ID starting with fb11c3975fbb55871b19e5081827de4db9c0982a39623ec129c6f3d4b8311b8a not found: ID does not exist" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.896772 4991 scope.go:117] "RemoveContainer" containerID="a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5" Dec 06 01:24:53 crc kubenswrapper[4991]: I1206 01:24:53.897238 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5"} err="failed to get container status \"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5\": rpc error: code = NotFound desc = could not find container \"a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5\": container with ID starting with a6c6c9dc0c80b215dfc19e494c061917edb06d5e97e059d1b1fd18e136d7f1a5 not found: ID does not exist" Dec 06 01:24:53 crc kubenswrapper[4991]: W1206 01:24:53.901495 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d67a449_d9a3_4a6a_bd6f_bbdd4c0de1fd.slice/crio-0a5ed9939248e6ae4888677515496c8a4ba9f8c55774eb06a2d5facf68a8fa31 WatchSource:0}: Error finding container 0a5ed9939248e6ae4888677515496c8a4ba9f8c55774eb06a2d5facf68a8fa31: Status 404 returned error can't find the container with id 0a5ed9939248e6ae4888677515496c8a4ba9f8c55774eb06a2d5facf68a8fa31 Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.181488 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.194075 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.201125 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:54 crc kubenswrapper[4991]: E1206 01:24:54.201781 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-log" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.201797 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-log" Dec 06 01:24:54 crc kubenswrapper[4991]: E1206 01:24:54.201822 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-metadata" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.201828 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-metadata" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.202025 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-metadata" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.202046 4991 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" containerName="nova-metadata-log" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.203017 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.211561 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.211629 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.256639 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.298245 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8603a428-6117-4bbb-a409-45aa3c447611-logs\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.298326 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dh8\" (UniqueName: \"kubernetes.io/projected/8603a428-6117-4bbb-a409-45aa3c447611-kube-api-access-58dh8\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.298354 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.298369 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-config-data\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.298401 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.400449 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dh8\" (UniqueName: \"kubernetes.io/projected/8603a428-6117-4bbb-a409-45aa3c447611-kube-api-access-58dh8\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.400520 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.400550 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-config-data\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.400588 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.400708 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8603a428-6117-4bbb-a409-45aa3c447611-logs\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.401204 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8603a428-6117-4bbb-a409-45aa3c447611-logs\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.405122 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.405810 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.414331 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-config-data\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 
01:24:54.419954 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dh8\" (UniqueName: \"kubernetes.io/projected/8603a428-6117-4bbb-a409-45aa3c447611-kube-api-access-58dh8\") pod \"nova-metadata-0\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.565741 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.853905 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd","Type":"ContainerStarted","Data":"36913588e0972e1af7f06a809ed4bf9e3bc251ad1e2a19ef4cd54be9074caf46"} Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.854307 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd","Type":"ContainerStarted","Data":"0a5ed9939248e6ae4888677515496c8a4ba9f8c55774eb06a2d5facf68a8fa31"} Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.854340 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.855624 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6292e839-f3b8-4072-a6d3-40c9a88e5656" containerName="nova-scheduler-scheduler" containerID="cri-o://b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" gracePeriod=30 Dec 06 01:24:54 crc kubenswrapper[4991]: I1206 01:24:54.881740 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.881720861 podStartE2EDuration="2.881720861s" podCreationTimestamp="2025-12-06 01:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:54.873177103 +0000 UTC m=+1368.223467571" watchObservedRunningTime="2025-12-06 01:24:54.881720861 +0000 UTC m=+1368.232011329" Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.048979 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e950beb5-e2d8-410a-aadd-78a4a6b94329" path="/var/lib/kubelet/pods/e950beb5-e2d8-410a-aadd-78a4a6b94329/volumes" Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.058383 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.872027 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8603a428-6117-4bbb-a409-45aa3c447611","Type":"ContainerStarted","Data":"d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6"} Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.872384 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8603a428-6117-4bbb-a409-45aa3c447611","Type":"ContainerStarted","Data":"81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e"} Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.872402 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8603a428-6117-4bbb-a409-45aa3c447611","Type":"ContainerStarted","Data":"3e250d89491720d399fa8b173bc199de94980e0111df56e9fc6f0cfaa8748078"} Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.896864 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.896841739 podStartE2EDuration="1.896841739s" podCreationTimestamp="2025-12-06 01:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:24:55.894411737 +0000 UTC m=+1369.244702235" 
watchObservedRunningTime="2025-12-06 01:24:55.896841739 +0000 UTC m=+1369.247132207" Dec 06 01:24:55 crc kubenswrapper[4991]: I1206 01:24:55.974669 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 01:24:56 crc kubenswrapper[4991]: E1206 01:24:56.308194 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 01:24:56 crc kubenswrapper[4991]: E1206 01:24:56.309487 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 01:24:56 crc kubenswrapper[4991]: E1206 01:24:56.310959 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 06 01:24:56 crc kubenswrapper[4991]: E1206 01:24:56.311027 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6292e839-f3b8-4072-a6d3-40c9a88e5656" containerName="nova-scheduler-scheduler" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.606880 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.705745 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-config-data\") pod \"6292e839-f3b8-4072-a6d3-40c9a88e5656\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.705803 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnnqw\" (UniqueName: \"kubernetes.io/projected/6292e839-f3b8-4072-a6d3-40c9a88e5656-kube-api-access-dnnqw\") pod \"6292e839-f3b8-4072-a6d3-40c9a88e5656\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.705933 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-combined-ca-bundle\") pod \"6292e839-f3b8-4072-a6d3-40c9a88e5656\" (UID: \"6292e839-f3b8-4072-a6d3-40c9a88e5656\") " Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.714810 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6292e839-f3b8-4072-a6d3-40c9a88e5656-kube-api-access-dnnqw" (OuterVolumeSpecName: "kube-api-access-dnnqw") pod "6292e839-f3b8-4072-a6d3-40c9a88e5656" (UID: "6292e839-f3b8-4072-a6d3-40c9a88e5656"). InnerVolumeSpecName "kube-api-access-dnnqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.756169 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-config-data" (OuterVolumeSpecName: "config-data") pod "6292e839-f3b8-4072-a6d3-40c9a88e5656" (UID: "6292e839-f3b8-4072-a6d3-40c9a88e5656"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.780130 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6292e839-f3b8-4072-a6d3-40c9a88e5656" (UID: "6292e839-f3b8-4072-a6d3-40c9a88e5656"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.809220 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.809262 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6292e839-f3b8-4072-a6d3-40c9a88e5656-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.809271 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnnqw\" (UniqueName: \"kubernetes.io/projected/6292e839-f3b8-4072-a6d3-40c9a88e5656-kube-api-access-dnnqw\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.962977 4991 generic.go:334] "Generic (PLEG): container finished" podID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerID="b428dcd737da574263318d54829b55013b5e0339d4a634ad63d8a039c444a457" exitCode=0 Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.963095 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90efc906-29ab-474c-9db5-d8b8dc495afc","Type":"ContainerDied","Data":"b428dcd737da574263318d54829b55013b5e0339d4a634ad63d8a039c444a457"} Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.975258 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="6292e839-f3b8-4072-a6d3-40c9a88e5656" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" exitCode=0 Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.975318 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6292e839-f3b8-4072-a6d3-40c9a88e5656","Type":"ContainerDied","Data":"b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342"} Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.975345 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6292e839-f3b8-4072-a6d3-40c9a88e5656","Type":"ContainerDied","Data":"3c56ddfe976209d2aaec40aca1dc01f0105d2409b9d3cdfe617fc43d99c2dd63"} Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.975362 4991 scope.go:117] "RemoveContainer" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" Dec 06 01:24:58 crc kubenswrapper[4991]: I1206 01:24:58.975532 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.001291 4991 scope.go:117] "RemoveContainer" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" Dec 06 01:24:59 crc kubenswrapper[4991]: E1206 01:24:59.007544 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342\": container with ID starting with b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342 not found: ID does not exist" containerID="b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.007586 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342"} err="failed to get container status \"b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342\": rpc error: code = NotFound desc = could not find container \"b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342\": container with ID starting with b7423a8468e754b55219f207d99ae1df2eeb0e2f6250d688c8a62ad71a723342 not found: ID does not exist" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.022761 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.030126 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.056049 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.061352 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:59 crc kubenswrapper[4991]: E1206 01:24:59.061736 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-api" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.061757 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-api" Dec 06 01:24:59 crc kubenswrapper[4991]: E1206 01:24:59.061769 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-log" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.061778 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-log" Dec 06 01:24:59 crc kubenswrapper[4991]: E1206 01:24:59.061812 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6292e839-f3b8-4072-a6d3-40c9a88e5656" containerName="nova-scheduler-scheduler" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.061819 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6292e839-f3b8-4072-a6d3-40c9a88e5656" containerName="nova-scheduler-scheduler" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.062026 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-api" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.062062 4991 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6292e839-f3b8-4072-a6d3-40c9a88e5656" containerName="nova-scheduler-scheduler" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.062071 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" containerName="nova-api-log" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.062679 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.069515 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.070332 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.119541 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-combined-ca-bundle\") pod \"90efc906-29ab-474c-9db5-d8b8dc495afc\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.119940 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90efc906-29ab-474c-9db5-d8b8dc495afc-logs\") pod \"90efc906-29ab-474c-9db5-d8b8dc495afc\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.120094 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-config-data\") pod \"90efc906-29ab-474c-9db5-d8b8dc495afc\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.120419 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-p7pjk\" (UniqueName: \"kubernetes.io/projected/90efc906-29ab-474c-9db5-d8b8dc495afc-kube-api-access-p7pjk\") pod \"90efc906-29ab-474c-9db5-d8b8dc495afc\" (UID: \"90efc906-29ab-474c-9db5-d8b8dc495afc\") " Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.123854 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90efc906-29ab-474c-9db5-d8b8dc495afc-logs" (OuterVolumeSpecName: "logs") pod "90efc906-29ab-474c-9db5-d8b8dc495afc" (UID: "90efc906-29ab-474c-9db5-d8b8dc495afc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.130483 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90efc906-29ab-474c-9db5-d8b8dc495afc-kube-api-access-p7pjk" (OuterVolumeSpecName: "kube-api-access-p7pjk") pod "90efc906-29ab-474c-9db5-d8b8dc495afc" (UID: "90efc906-29ab-474c-9db5-d8b8dc495afc"). InnerVolumeSpecName "kube-api-access-p7pjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.147831 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-config-data" (OuterVolumeSpecName: "config-data") pod "90efc906-29ab-474c-9db5-d8b8dc495afc" (UID: "90efc906-29ab-474c-9db5-d8b8dc495afc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.155195 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90efc906-29ab-474c-9db5-d8b8dc495afc" (UID: "90efc906-29ab-474c-9db5-d8b8dc495afc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223008 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-config-data\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223094 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223156 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vznf\" (UniqueName: \"kubernetes.io/projected/fed182dc-f3b0-470b-965c-66f5cbb719b5-kube-api-access-6vznf\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223589 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7pjk\" (UniqueName: \"kubernetes.io/projected/90efc906-29ab-474c-9db5-d8b8dc495afc-kube-api-access-p7pjk\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223631 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223641 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90efc906-29ab-474c-9db5-d8b8dc495afc-logs\") on node 
\"crc\" DevicePath \"\"" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.223651 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90efc906-29ab-474c-9db5-d8b8dc495afc-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.325468 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-config-data\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.325600 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.325662 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vznf\" (UniqueName: \"kubernetes.io/projected/fed182dc-f3b0-470b-965c-66f5cbb719b5-kube-api-access-6vznf\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.328925 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-config-data\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.329692 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.349724 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vznf\" (UniqueName: \"kubernetes.io/projected/fed182dc-f3b0-470b-965c-66f5cbb719b5-kube-api-access-6vznf\") pod \"nova-scheduler-0\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.383244 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.566839 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.567018 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.710605 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.711311 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" containerName="kube-state-metrics" containerID="cri-o://f3a1e7416e7bf4f0b9c7aec629a783a2ad163987c711023f3b779d6c2d3fa94e" gracePeriod=30 Dec 06 01:24:59 crc kubenswrapper[4991]: I1206 01:24:59.879509 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:24:59 crc kubenswrapper[4991]: W1206 01:24:59.891647 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed182dc_f3b0_470b_965c_66f5cbb719b5.slice/crio-1174ef697b8a04e670d4936f7f793b26c0262201d86546e24f5472de7e39b43a WatchSource:0}: Error finding container 
1174ef697b8a04e670d4936f7f793b26c0262201d86546e24f5472de7e39b43a: Status 404 returned error can't find the container with id 1174ef697b8a04e670d4936f7f793b26c0262201d86546e24f5472de7e39b43a Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.005187 4991 generic.go:334] "Generic (PLEG): container finished" podID="c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" containerID="f3a1e7416e7bf4f0b9c7aec629a783a2ad163987c711023f3b779d6c2d3fa94e" exitCode=2 Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.005298 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8","Type":"ContainerDied","Data":"f3a1e7416e7bf4f0b9c7aec629a783a2ad163987c711023f3b779d6c2d3fa94e"} Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.020436 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fed182dc-f3b0-470b-965c-66f5cbb719b5","Type":"ContainerStarted","Data":"1174ef697b8a04e670d4936f7f793b26c0262201d86546e24f5472de7e39b43a"} Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.029975 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.029805 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90efc906-29ab-474c-9db5-d8b8dc495afc","Type":"ContainerDied","Data":"e12b51fe2f552287f96040acd57b079e6c506353a4c71a2c35ba3619f85b7bed"} Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.033928 4991 scope.go:117] "RemoveContainer" containerID="b428dcd737da574263318d54829b55013b5e0339d4a634ad63d8a039c444a457" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.092593 4991 scope.go:117] "RemoveContainer" containerID="70f7b5404b4c3cbed5ff8f35410c5c9ca47d9a96aa6520b0bf11c0d8c65c9b2f" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.112338 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.135538 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.143581 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.145149 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.148761 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.157659 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.229243 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.249662 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x5fq\" (UniqueName: \"kubernetes.io/projected/4a1e5c38-a110-4c12-947e-e729e0851bd3-kube-api-access-9x5fq\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.249721 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a1e5c38-a110-4c12-947e-e729e0851bd3-logs\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.249836 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-config-data\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.250095 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.350957 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqbh5\" (UniqueName: \"kubernetes.io/projected/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8-kube-api-access-jqbh5\") pod \"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8\" (UID: \"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8\") " Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.351422 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x5fq\" (UniqueName: \"kubernetes.io/projected/4a1e5c38-a110-4c12-947e-e729e0851bd3-kube-api-access-9x5fq\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.351468 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a1e5c38-a110-4c12-947e-e729e0851bd3-logs\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.351530 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-config-data\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.351599 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.352293 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a1e5c38-a110-4c12-947e-e729e0851bd3-logs\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.356312 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8-kube-api-access-jqbh5" (OuterVolumeSpecName: "kube-api-access-jqbh5") pod "c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" (UID: 
"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8"). InnerVolumeSpecName "kube-api-access-jqbh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.360561 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-config-data\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.360940 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.380191 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x5fq\" (UniqueName: \"kubernetes.io/projected/4a1e5c38-a110-4c12-947e-e729e0851bd3-kube-api-access-9x5fq\") pod \"nova-api-0\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " pod="openstack/nova-api-0" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.453451 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqbh5\" (UniqueName: \"kubernetes.io/projected/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8-kube-api-access-jqbh5\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:00 crc kubenswrapper[4991]: I1206 01:25:00.525813 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.041738 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.065529 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6292e839-f3b8-4072-a6d3-40c9a88e5656" path="/var/lib/kubelet/pods/6292e839-f3b8-4072-a6d3-40c9a88e5656/volumes" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.066174 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90efc906-29ab-474c-9db5-d8b8dc495afc" path="/var/lib/kubelet/pods/90efc906-29ab-474c-9db5-d8b8dc495afc/volumes" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.066910 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8","Type":"ContainerDied","Data":"668a2e308d7ca0fab9a9fa25c913a5bf844fd23179999c4eb672a974283b9773"} Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.066945 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fed182dc-f3b0-470b-965c-66f5cbb719b5","Type":"ContainerStarted","Data":"cb371c896f879db4159a31a00a882866b3c26a725fcc5d68b39c01108d773cdf"} Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.067144 4991 scope.go:117] "RemoveContainer" containerID="f3a1e7416e7bf4f0b9c7aec629a783a2ad163987c711023f3b779d6c2d3fa94e" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.078947 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.078931887 podStartE2EDuration="2.078931887s" podCreationTimestamp="2025-12-06 01:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:01.074480103 +0000 UTC m=+1374.424770581" watchObservedRunningTime="2025-12-06 01:25:01.078931887 +0000 UTC m=+1374.429222355" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.097320 4991 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.107051 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.117401 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.141758 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:25:01 crc kubenswrapper[4991]: E1206 01:25:01.147053 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" containerName="kube-state-metrics" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.147081 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" containerName="kube-state-metrics" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.147649 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" containerName="kube-state-metrics" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.148820 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.157487 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.158257 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.180733 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.280139 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.280308 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxj6\" (UniqueName: \"kubernetes.io/projected/64844b68-4772-49f7-9328-8eed4c925267-kube-api-access-dlxj6\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.280528 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.280723 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.387153 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.387814 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.387901 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.387927 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxj6\" (UniqueName: \"kubernetes.io/projected/64844b68-4772-49f7-9328-8eed4c925267-kube-api-access-dlxj6\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.391684 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.393841 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.396244 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64844b68-4772-49f7-9328-8eed4c925267-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.411564 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxj6\" (UniqueName: \"kubernetes.io/projected/64844b68-4772-49f7-9328-8eed4c925267-kube-api-access-dlxj6\") pod \"kube-state-metrics-0\" (UID: \"64844b68-4772-49f7-9328-8eed4c925267\") " pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.557283 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.949742 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.950269 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-central-agent" containerID="cri-o://c18cb57f467f2a5287fc106969693f91ed3079a86c8dbde8b35af87df6dc4c16" gracePeriod=30 Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.950395 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-notification-agent" containerID="cri-o://e6daab8b21fd31e0304f1079dc3cf6bf52fa4ee9e5f3808aa67af05423c74d6c" gracePeriod=30 Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.950394 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="sg-core" containerID="cri-o://0a1cd767529c52b46a8c6ef7ac64e97205c0053589d1730827360c349e47f48b" gracePeriod=30 Dec 06 01:25:01 crc kubenswrapper[4991]: I1206 01:25:01.950343 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="proxy-httpd" containerID="cri-o://113dcd9de662fe5153f1615b7d15bcb1686d74ed586a7ce45a06f0af10645bcb" gracePeriod=30 Dec 06 01:25:02 crc kubenswrapper[4991]: W1206 01:25:02.033287 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64844b68_4772_49f7_9328_8eed4c925267.slice/crio-b54e2fa3ff29410ac3d89d77e72521b718561195bb5e7dc38e6d044d145a86f7 WatchSource:0}: Error finding container 
b54e2fa3ff29410ac3d89d77e72521b718561195bb5e7dc38e6d044d145a86f7: Status 404 returned error can't find the container with id b54e2fa3ff29410ac3d89d77e72521b718561195bb5e7dc38e6d044d145a86f7 Dec 06 01:25:02 crc kubenswrapper[4991]: I1206 01:25:02.036737 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 06 01:25:02 crc kubenswrapper[4991]: I1206 01:25:02.060942 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"64844b68-4772-49f7-9328-8eed4c925267","Type":"ContainerStarted","Data":"b54e2fa3ff29410ac3d89d77e72521b718561195bb5e7dc38e6d044d145a86f7"} Dec 06 01:25:02 crc kubenswrapper[4991]: I1206 01:25:02.070337 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a1e5c38-a110-4c12-947e-e729e0851bd3","Type":"ContainerStarted","Data":"7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39"} Dec 06 01:25:02 crc kubenswrapper[4991]: I1206 01:25:02.070392 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a1e5c38-a110-4c12-947e-e729e0851bd3","Type":"ContainerStarted","Data":"be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11"} Dec 06 01:25:02 crc kubenswrapper[4991]: I1206 01:25:02.070405 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a1e5c38-a110-4c12-947e-e729e0851bd3","Type":"ContainerStarted","Data":"947bcfd50220fb299ccdf3cbbec1a7df54b1cfed7e4b1290824ec71cd7783a22"} Dec 06 01:25:02 crc kubenswrapper[4991]: I1206 01:25:02.097482 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.097444052 podStartE2EDuration="2.097444052s" podCreationTimestamp="2025-12-06 01:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:02.092971837 +0000 UTC 
m=+1375.443262315" watchObservedRunningTime="2025-12-06 01:25:02.097444052 +0000 UTC m=+1375.447734520" Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.052263 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8" path="/var/lib/kubelet/pods/c8f4ebbe-9528-4c3d-b28c-ac0eff0176d8/volumes" Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.085468 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"64844b68-4772-49f7-9328-8eed4c925267","Type":"ContainerStarted","Data":"8ba91548117c1f98bf92621dc0dd5023e1d0b25bcc808e5f21b899e29cb777a7"} Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.085589 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.090283 4991 generic.go:334] "Generic (PLEG): container finished" podID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerID="113dcd9de662fe5153f1615b7d15bcb1686d74ed586a7ce45a06f0af10645bcb" exitCode=0 Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.090317 4991 generic.go:334] "Generic (PLEG): container finished" podID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerID="0a1cd767529c52b46a8c6ef7ac64e97205c0053589d1730827360c349e47f48b" exitCode=2 Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.090326 4991 generic.go:334] "Generic (PLEG): container finished" podID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerID="c18cb57f467f2a5287fc106969693f91ed3079a86c8dbde8b35af87df6dc4c16" exitCode=0 Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.090335 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerDied","Data":"113dcd9de662fe5153f1615b7d15bcb1686d74ed586a7ce45a06f0af10645bcb"} Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.090386 4991 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerDied","Data":"0a1cd767529c52b46a8c6ef7ac64e97205c0053589d1730827360c349e47f48b"} Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.090403 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerDied","Data":"c18cb57f467f2a5287fc106969693f91ed3079a86c8dbde8b35af87df6dc4c16"} Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.103793 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.737756042 podStartE2EDuration="2.103778375s" podCreationTimestamp="2025-12-06 01:25:01 +0000 UTC" firstStartedPulling="2025-12-06 01:25:02.035753305 +0000 UTC m=+1375.386043773" lastFinishedPulling="2025-12-06 01:25:02.401775638 +0000 UTC m=+1375.752066106" observedRunningTime="2025-12-06 01:25:03.102062721 +0000 UTC m=+1376.452353189" watchObservedRunningTime="2025-12-06 01:25:03.103778375 +0000 UTC m=+1376.454068843" Dec 06 01:25:03 crc kubenswrapper[4991]: I1206 01:25:03.370601 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 06 01:25:04 crc kubenswrapper[4991]: I1206 01:25:04.386960 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 01:25:04 crc kubenswrapper[4991]: I1206 01:25:04.567256 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 01:25:04 crc kubenswrapper[4991]: I1206 01:25:04.568268 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.145741 4991 generic.go:334] "Generic (PLEG): container finished" podID="52b047e6-8844-421a-877f-3cbda2e6d6a7" 
containerID="e6daab8b21fd31e0304f1079dc3cf6bf52fa4ee9e5f3808aa67af05423c74d6c" exitCode=0 Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.147396 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerDied","Data":"e6daab8b21fd31e0304f1079dc3cf6bf52fa4ee9e5f3808aa67af05423c74d6c"} Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.287917 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.387999 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvkmv\" (UniqueName: \"kubernetes.io/projected/52b047e6-8844-421a-877f-3cbda2e6d6a7-kube-api-access-dvkmv\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.389339 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-config-data\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.390560 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-run-httpd\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.391200 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-log-httpd\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc 
kubenswrapper[4991]: I1206 01:25:05.391872 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-combined-ca-bundle\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.391071 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.393356 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.393526 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-sg-core-conf-yaml\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.393628 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-scripts\") pod \"52b047e6-8844-421a-877f-3cbda2e6d6a7\" (UID: \"52b047e6-8844-421a-877f-3cbda2e6d6a7\") " Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.395766 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b047e6-8844-421a-877f-3cbda2e6d6a7-kube-api-access-dvkmv" (OuterVolumeSpecName: "kube-api-access-dvkmv") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "kube-api-access-dvkmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.395930 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.400178 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52b047e6-8844-421a-877f-3cbda2e6d6a7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.400137 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-scripts" (OuterVolumeSpecName: "scripts") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.422964 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.498515 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.501781 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvkmv\" (UniqueName: \"kubernetes.io/projected/52b047e6-8844-421a-877f-3cbda2e6d6a7-kube-api-access-dvkmv\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.501815 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.501826 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.501836 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.507422 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-config-data" (OuterVolumeSpecName: "config-data") pod "52b047e6-8844-421a-877f-3cbda2e6d6a7" (UID: "52b047e6-8844-421a-877f-3cbda2e6d6a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.581185 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.581253 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:05 crc kubenswrapper[4991]: I1206 01:25:05.604111 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b047e6-8844-421a-877f-3cbda2e6d6a7-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.160935 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52b047e6-8844-421a-877f-3cbda2e6d6a7","Type":"ContainerDied","Data":"866c04d3860e48fbb0f3c6008b430b7c3860ebafeab8b86b220512051e368c15"} Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.161058 4991 scope.go:117] "RemoveContainer" containerID="113dcd9de662fe5153f1615b7d15bcb1686d74ed586a7ce45a06f0af10645bcb" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.161081 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.202690 4991 scope.go:117] "RemoveContainer" containerID="0a1cd767529c52b46a8c6ef7ac64e97205c0053589d1730827360c349e47f48b" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.253393 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.268073 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.273409 4991 scope.go:117] "RemoveContainer" containerID="e6daab8b21fd31e0304f1079dc3cf6bf52fa4ee9e5f3808aa67af05423c74d6c" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.285136 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:06 crc kubenswrapper[4991]: E1206 01:25:06.285873 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="sg-core" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.285906 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="sg-core" Dec 06 01:25:06 crc kubenswrapper[4991]: E1206 01:25:06.285937 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-notification-agent" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.285951 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-notification-agent" Dec 06 01:25:06 crc kubenswrapper[4991]: E1206 01:25:06.286087 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="proxy-httpd" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.286114 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="proxy-httpd" Dec 06 01:25:06 crc kubenswrapper[4991]: E1206 01:25:06.286141 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-central-agent" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.286154 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-central-agent" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.286503 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="sg-core" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.286542 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-notification-agent" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.286602 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="ceilometer-central-agent" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.286636 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" containerName="proxy-httpd" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.293751 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.300223 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.300466 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.301493 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.302476 4991 scope.go:117] "RemoveContainer" containerID="c18cb57f467f2a5287fc106969693f91ed3079a86c8dbde8b35af87df6dc4c16" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324051 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-run-httpd\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324132 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9csz\" (UniqueName: \"kubernetes.io/projected/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-kube-api-access-k9csz\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324170 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324222 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-config-data\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324304 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-log-httpd\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324388 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324413 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.324500 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-scripts\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.325447 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425478 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-scripts\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425556 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-run-httpd\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425592 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9csz\" (UniqueName: \"kubernetes.io/projected/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-kube-api-access-k9csz\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425614 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425645 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-config-data\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425673 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425714 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.425741 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.426819 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-run-httpd\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.426861 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-log-httpd\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.430807 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.431751 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-scripts\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.433667 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.440558 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-config-data\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.444843 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.445336 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9csz\" (UniqueName: \"kubernetes.io/projected/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-kube-api-access-k9csz\") pod \"ceilometer-0\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.618232 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:06 crc kubenswrapper[4991]: I1206 01:25:06.879566 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:07 crc kubenswrapper[4991]: I1206 01:25:07.064542 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b047e6-8844-421a-877f-3cbda2e6d6a7" path="/var/lib/kubelet/pods/52b047e6-8844-421a-877f-3cbda2e6d6a7/volumes" Dec 06 01:25:07 crc kubenswrapper[4991]: I1206 01:25:07.171564 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerStarted","Data":"b0150f6a055cdf40802a71cc912f38729bce07e56c2ed2f66a4482c2129849a2"} Dec 06 01:25:08 crc kubenswrapper[4991]: I1206 01:25:08.199234 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerStarted","Data":"d1e785f4a696fb7c97e1c13eb2a812140ddbfec63424579f91f8516f292cd5c2"} Dec 06 01:25:09 crc kubenswrapper[4991]: I1206 01:25:09.214972 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerStarted","Data":"fa40c374476da9ee5a040231e357926cb1ca118048e25e63234d20bd5bef9570"} Dec 06 01:25:09 crc kubenswrapper[4991]: I1206 01:25:09.383716 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 01:25:09 crc kubenswrapper[4991]: I1206 01:25:09.435556 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 01:25:10 crc kubenswrapper[4991]: I1206 01:25:10.236706 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerStarted","Data":"b751924537a72c83133bb3b7375b98f27bdc0a7cef9c0a237269544da66816d6"} Dec 06 
01:25:10 crc kubenswrapper[4991]: I1206 01:25:10.277705 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 01:25:10 crc kubenswrapper[4991]: I1206 01:25:10.526732 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 01:25:10 crc kubenswrapper[4991]: I1206 01:25:10.528343 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 01:25:11 crc kubenswrapper[4991]: I1206 01:25:11.255397 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerStarted","Data":"ef80a7daee3ddf6976c39e699c3cb20a86ff155c30f8fb5a5cdb3e1ecd29f03c"} Dec 06 01:25:11 crc kubenswrapper[4991]: I1206 01:25:11.255711 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:25:11 crc kubenswrapper[4991]: I1206 01:25:11.279920 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.946358258 podStartE2EDuration="5.279893854s" podCreationTimestamp="2025-12-06 01:25:06 +0000 UTC" firstStartedPulling="2025-12-06 01:25:06.89207616 +0000 UTC m=+1380.242366638" lastFinishedPulling="2025-12-06 01:25:10.225611766 +0000 UTC m=+1383.575902234" observedRunningTime="2025-12-06 01:25:11.278468788 +0000 UTC m=+1384.628759256" watchObservedRunningTime="2025-12-06 01:25:11.279893854 +0000 UTC m=+1384.630184332" Dec 06 01:25:11 crc kubenswrapper[4991]: I1206 01:25:11.609479 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:11 crc kubenswrapper[4991]: I1206 01:25:11.609855 4991 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:11 crc kubenswrapper[4991]: I1206 01:25:11.628665 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 06 01:25:14 crc kubenswrapper[4991]: I1206 01:25:14.581448 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 01:25:14 crc kubenswrapper[4991]: I1206 01:25:14.582837 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 01:25:14 crc kubenswrapper[4991]: I1206 01:25:14.596148 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 01:25:14 crc kubenswrapper[4991]: I1206 01:25:14.598875 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.153906 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.251248 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbkp\" (UniqueName: \"kubernetes.io/projected/fdcb1d06-e919-463f-9301-596a4524e303-kube-api-access-nxbkp\") pod \"fdcb1d06-e919-463f-9301-596a4524e303\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.251296 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-config-data\") pod \"fdcb1d06-e919-463f-9301-596a4524e303\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.251473 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-combined-ca-bundle\") pod \"fdcb1d06-e919-463f-9301-596a4524e303\" (UID: \"fdcb1d06-e919-463f-9301-596a4524e303\") " Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.257118 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdcb1d06-e919-463f-9301-596a4524e303-kube-api-access-nxbkp" (OuterVolumeSpecName: "kube-api-access-nxbkp") pod "fdcb1d06-e919-463f-9301-596a4524e303" (UID: "fdcb1d06-e919-463f-9301-596a4524e303"). InnerVolumeSpecName "kube-api-access-nxbkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.292097 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-config-data" (OuterVolumeSpecName: "config-data") pod "fdcb1d06-e919-463f-9301-596a4524e303" (UID: "fdcb1d06-e919-463f-9301-596a4524e303"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.298702 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdcb1d06-e919-463f-9301-596a4524e303" (UID: "fdcb1d06-e919-463f-9301-596a4524e303"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.335534 4991 generic.go:334] "Generic (PLEG): container finished" podID="fdcb1d06-e919-463f-9301-596a4524e303" containerID="e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a" exitCode=137 Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.335584 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdcb1d06-e919-463f-9301-596a4524e303","Type":"ContainerDied","Data":"e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a"} Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.335621 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdcb1d06-e919-463f-9301-596a4524e303","Type":"ContainerDied","Data":"195954df317a2d0f73412817d491fd27048e7823002d7abe1db5389b65a323eb"} Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.335612 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.335689 4991 scope.go:117] "RemoveContainer" containerID="e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.353396 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.353435 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbkp\" (UniqueName: \"kubernetes.io/projected/fdcb1d06-e919-463f-9301-596a4524e303-kube-api-access-nxbkp\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.353451 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcb1d06-e919-463f-9301-596a4524e303-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.371463 4991 scope.go:117] "RemoveContainer" containerID="e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a" Dec 06 01:25:17 crc kubenswrapper[4991]: E1206 01:25:17.371862 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a\": container with ID starting with e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a not found: ID does not exist" containerID="e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.371908 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a"} err="failed to get container status 
\"e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a\": rpc error: code = NotFound desc = could not find container \"e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a\": container with ID starting with e5a1c10272cc48df479abbaf5daf6cf6fa65ac024ad2d9f1b252319378b9401a not found: ID does not exist" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.375565 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.390457 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.403109 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:25:17 crc kubenswrapper[4991]: E1206 01:25:17.403622 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdcb1d06-e919-463f-9301-596a4524e303" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.403645 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdcb1d06-e919-463f-9301-596a4524e303" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.403901 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdcb1d06-e919-463f-9301-596a4524e303" containerName="nova-cell1-novncproxy-novncproxy" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.404727 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.407235 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.409177 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.409406 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.417907 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.557543 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.557603 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.557638 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 
01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.557705 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzsz\" (UniqueName: \"kubernetes.io/projected/c5be01c8-e833-4da5-a09e-da95ec913708-kube-api-access-8hzsz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.558224 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.659975 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.660103 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.660185 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 
01:25:17.660236 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.660341 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzsz\" (UniqueName: \"kubernetes.io/projected/c5be01c8-e833-4da5-a09e-da95ec913708-kube-api-access-8hzsz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.664249 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.664299 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.664503 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.666710 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5be01c8-e833-4da5-a09e-da95ec913708-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.677326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hzsz\" (UniqueName: \"kubernetes.io/projected/c5be01c8-e833-4da5-a09e-da95ec913708-kube-api-access-8hzsz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5be01c8-e833-4da5-a09e-da95ec913708\") " pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:17 crc kubenswrapper[4991]: I1206 01:25:17.732284 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:18 crc kubenswrapper[4991]: I1206 01:25:18.271366 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 06 01:25:18 crc kubenswrapper[4991]: W1206 01:25:18.275140 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5be01c8_e833_4da5_a09e_da95ec913708.slice/crio-37a547c6a2f8e3ae004e286bf915a8f617a3dff09f2fcf54cc29483cea11e5af WatchSource:0}: Error finding container 37a547c6a2f8e3ae004e286bf915a8f617a3dff09f2fcf54cc29483cea11e5af: Status 404 returned error can't find the container with id 37a547c6a2f8e3ae004e286bf915a8f617a3dff09f2fcf54cc29483cea11e5af Dec 06 01:25:18 crc kubenswrapper[4991]: I1206 01:25:18.356044 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c5be01c8-e833-4da5-a09e-da95ec913708","Type":"ContainerStarted","Data":"37a547c6a2f8e3ae004e286bf915a8f617a3dff09f2fcf54cc29483cea11e5af"} Dec 06 01:25:19 crc kubenswrapper[4991]: I1206 01:25:19.054095 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fdcb1d06-e919-463f-9301-596a4524e303" path="/var/lib/kubelet/pods/fdcb1d06-e919-463f-9301-596a4524e303/volumes" Dec 06 01:25:19 crc kubenswrapper[4991]: I1206 01:25:19.370633 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c5be01c8-e833-4da5-a09e-da95ec913708","Type":"ContainerStarted","Data":"622d4fe926ed0a11eaa2ed99a08e50bcd21e8bd730190b75a09eefef70a2728b"} Dec 06 01:25:19 crc kubenswrapper[4991]: I1206 01:25:19.406358 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.406331155 podStartE2EDuration="2.406331155s" podCreationTimestamp="2025-12-06 01:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:19.40183564 +0000 UTC m=+1392.752126158" watchObservedRunningTime="2025-12-06 01:25:19.406331155 +0000 UTC m=+1392.756621653" Dec 06 01:25:20 crc kubenswrapper[4991]: I1206 01:25:20.534515 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 01:25:20 crc kubenswrapper[4991]: I1206 01:25:20.535744 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 01:25:20 crc kubenswrapper[4991]: I1206 01:25:20.541945 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 01:25:20 crc kubenswrapper[4991]: I1206 01:25:20.542143 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.392383 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.398703 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 01:25:21 crc 
kubenswrapper[4991]: I1206 01:25:21.648204 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-7d6sp"] Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.650177 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.669424 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-7d6sp"] Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.755696 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.755750 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.755779 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjs2b\" (UniqueName: \"kubernetes.io/projected/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-kube-api-access-mjs2b\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.756032 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-svc\") 
pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.756103 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-config\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.756603 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.858718 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjs2b\" (UniqueName: \"kubernetes.io/projected/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-kube-api-access-mjs2b\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.859173 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.859224 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-svc\") pod 
\"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.859243 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-config\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.859309 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.859394 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.860323 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.861139 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: 
\"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.861670 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.862603 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.862862 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-config\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.880233 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjs2b\" (UniqueName: \"kubernetes.io/projected/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-kube-api-access-mjs2b\") pod \"dnsmasq-dns-5c7b6c5df9-7d6sp\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:21 crc kubenswrapper[4991]: I1206 01:25:21.982800 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:22 crc kubenswrapper[4991]: W1206 01:25:22.525021 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73f37bf5_c4a0_4a59_9d3e_d18a1de7dd49.slice/crio-4cd1a750ec6dabfb18b69aab52b5357900a97ee2eb56c3eb683afa20f364fd77 WatchSource:0}: Error finding container 4cd1a750ec6dabfb18b69aab52b5357900a97ee2eb56c3eb683afa20f364fd77: Status 404 returned error can't find the container with id 4cd1a750ec6dabfb18b69aab52b5357900a97ee2eb56c3eb683afa20f364fd77 Dec 06 01:25:22 crc kubenswrapper[4991]: I1206 01:25:22.529106 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-7d6sp"] Dec 06 01:25:22 crc kubenswrapper[4991]: I1206 01:25:22.733132 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.424167 4991 generic.go:334] "Generic (PLEG): container finished" podID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerID="d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494" exitCode=0 Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.424258 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" event={"ID":"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49","Type":"ContainerDied","Data":"d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494"} Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.424611 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" event={"ID":"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49","Type":"ContainerStarted","Data":"4cd1a750ec6dabfb18b69aab52b5357900a97ee2eb56c3eb683afa20f364fd77"} Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.802250 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:23 crc 
kubenswrapper[4991]: I1206 01:25:23.803522 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-central-agent" containerID="cri-o://d1e785f4a696fb7c97e1c13eb2a812140ddbfec63424579f91f8516f292cd5c2" gracePeriod=30 Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.803680 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="proxy-httpd" containerID="cri-o://ef80a7daee3ddf6976c39e699c3cb20a86ff155c30f8fb5a5cdb3e1ecd29f03c" gracePeriod=30 Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.803701 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="sg-core" containerID="cri-o://b751924537a72c83133bb3b7375b98f27bdc0a7cef9c0a237269544da66816d6" gracePeriod=30 Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.803760 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-notification-agent" containerID="cri-o://fa40c374476da9ee5a040231e357926cb1ca118048e25e63234d20bd5bef9570" gracePeriod=30 Dec 06 01:25:23 crc kubenswrapper[4991]: I1206 01:25:23.813526 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": read tcp 10.217.0.2:33222->10.217.0.197:3000: read: connection reset by peer" Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.151382 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.442089 4991 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" event={"ID":"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49","Type":"ContainerStarted","Data":"13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6"} Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.442294 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446274 4991 generic.go:334] "Generic (PLEG): container finished" podID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerID="ef80a7daee3ddf6976c39e699c3cb20a86ff155c30f8fb5a5cdb3e1ecd29f03c" exitCode=0 Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446320 4991 generic.go:334] "Generic (PLEG): container finished" podID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerID="b751924537a72c83133bb3b7375b98f27bdc0a7cef9c0a237269544da66816d6" exitCode=2 Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446332 4991 generic.go:334] "Generic (PLEG): container finished" podID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerID="d1e785f4a696fb7c97e1c13eb2a812140ddbfec63424579f91f8516f292cd5c2" exitCode=0 Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446342 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerDied","Data":"ef80a7daee3ddf6976c39e699c3cb20a86ff155c30f8fb5a5cdb3e1ecd29f03c"} Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446381 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerDied","Data":"b751924537a72c83133bb3b7375b98f27bdc0a7cef9c0a237269544da66816d6"} Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446394 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerDied","Data":"d1e785f4a696fb7c97e1c13eb2a812140ddbfec63424579f91f8516f292cd5c2"} Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446596 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-log" containerID="cri-o://be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11" gracePeriod=30 Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.446624 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-api" containerID="cri-o://7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39" gracePeriod=30 Dec 06 01:25:24 crc kubenswrapper[4991]: I1206 01:25:24.466686 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" podStartSLOduration=3.466661193 podStartE2EDuration="3.466661193s" podCreationTimestamp="2025-12-06 01:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:24.463931153 +0000 UTC m=+1397.814221641" watchObservedRunningTime="2025-12-06 01:25:24.466661193 +0000 UTC m=+1397.816951651" Dec 06 01:25:25 crc kubenswrapper[4991]: I1206 01:25:25.459497 4991 generic.go:334] "Generic (PLEG): container finished" podID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerID="be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11" exitCode=143 Dec 06 01:25:25 crc kubenswrapper[4991]: I1206 01:25:25.459620 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a1e5c38-a110-4c12-947e-e729e0851bd3","Type":"ContainerDied","Data":"be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11"} Dec 06 01:25:27 crc kubenswrapper[4991]: 
I1206 01:25:27.733145 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:27 crc kubenswrapper[4991]: I1206 01:25:27.752143 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.101716 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.207978 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a1e5c38-a110-4c12-947e-e729e0851bd3-logs\") pod \"4a1e5c38-a110-4c12-947e-e729e0851bd3\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.208135 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-combined-ca-bundle\") pod \"4a1e5c38-a110-4c12-947e-e729e0851bd3\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.208180 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x5fq\" (UniqueName: \"kubernetes.io/projected/4a1e5c38-a110-4c12-947e-e729e0851bd3-kube-api-access-9x5fq\") pod \"4a1e5c38-a110-4c12-947e-e729e0851bd3\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.208439 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-config-data\") pod \"4a1e5c38-a110-4c12-947e-e729e0851bd3\" (UID: \"4a1e5c38-a110-4c12-947e-e729e0851bd3\") " Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.208708 4991 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a1e5c38-a110-4c12-947e-e729e0851bd3-logs" (OuterVolumeSpecName: "logs") pod "4a1e5c38-a110-4c12-947e-e729e0851bd3" (UID: "4a1e5c38-a110-4c12-947e-e729e0851bd3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.209251 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a1e5c38-a110-4c12-947e-e729e0851bd3-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.235816 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1e5c38-a110-4c12-947e-e729e0851bd3-kube-api-access-9x5fq" (OuterVolumeSpecName: "kube-api-access-9x5fq") pod "4a1e5c38-a110-4c12-947e-e729e0851bd3" (UID: "4a1e5c38-a110-4c12-947e-e729e0851bd3"). InnerVolumeSpecName "kube-api-access-9x5fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.242330 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-config-data" (OuterVolumeSpecName: "config-data") pod "4a1e5c38-a110-4c12-947e-e729e0851bd3" (UID: "4a1e5c38-a110-4c12-947e-e729e0851bd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.249346 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a1e5c38-a110-4c12-947e-e729e0851bd3" (UID: "4a1e5c38-a110-4c12-947e-e729e0851bd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.311102 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x5fq\" (UniqueName: \"kubernetes.io/projected/4a1e5c38-a110-4c12-947e-e729e0851bd3-kube-api-access-9x5fq\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.311149 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.311165 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a1e5c38-a110-4c12-947e-e729e0851bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.491428 4991 generic.go:334] "Generic (PLEG): container finished" podID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerID="7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39" exitCode=0 Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.491471 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.491533 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a1e5c38-a110-4c12-947e-e729e0851bd3","Type":"ContainerDied","Data":"7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39"} Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.491595 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a1e5c38-a110-4c12-947e-e729e0851bd3","Type":"ContainerDied","Data":"947bcfd50220fb299ccdf3cbbec1a7df54b1cfed7e4b1290824ec71cd7783a22"} Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.491614 4991 scope.go:117] "RemoveContainer" containerID="7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.519031 4991 scope.go:117] "RemoveContainer" containerID="be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.542115 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.545547 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.570406 4991 scope.go:117] "RemoveContainer" containerID="7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.570896 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:28 crc kubenswrapper[4991]: E1206 01:25:28.571951 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39\": container with ID starting with 
7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39 not found: ID does not exist" containerID="7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.572007 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39"} err="failed to get container status \"7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39\": rpc error: code = NotFound desc = could not find container \"7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39\": container with ID starting with 7ab10cfbe824c8d6686e477b0603a2a32fb93bc03ac4c74cc46526e24f848d39 not found: ID does not exist" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.572037 4991 scope.go:117] "RemoveContainer" containerID="be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11" Dec 06 01:25:28 crc kubenswrapper[4991]: E1206 01:25:28.572555 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11\": container with ID starting with be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11 not found: ID does not exist" containerID="be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.572588 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11"} err="failed to get container status \"be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11\": rpc error: code = NotFound desc = could not find container \"be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11\": container with ID starting with be9f29dbb68234821bd0482951976d9dc4c8c2cdd12c11c0ecb09366a7cb0e11 not found: ID does not 
exist" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.587201 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:28 crc kubenswrapper[4991]: E1206 01:25:28.587793 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-api" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.587816 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-api" Dec 06 01:25:28 crc kubenswrapper[4991]: E1206 01:25:28.587848 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-log" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.587855 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-log" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.588103 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-api" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.588129 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" containerName="nova-api-log" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.589240 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.596714 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.597268 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.597769 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.600681 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.718937 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.719031 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glvp\" (UniqueName: \"kubernetes.io/projected/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-kube-api-access-4glvp\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.719088 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.719153 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-logs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.719311 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-config-data\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.719383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.810948 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wkngq"] Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.813341 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.818701 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.820920 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4glvp\" (UniqueName: \"kubernetes.io/projected/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-kube-api-access-4glvp\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.821018 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.821112 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-logs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.821184 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-config-data\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.821214 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " 
pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.821263 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.822403 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-logs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.826672 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.837425 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.837973 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-config-data\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.838680 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.853813 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.859593 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glvp\" (UniqueName: \"kubernetes.io/projected/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-kube-api-access-4glvp\") pod \"nova-api-0\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.906349 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.911159 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wkngq"] Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.927783 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skchp\" (UniqueName: \"kubernetes.io/projected/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-kube-api-access-skchp\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.927846 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.927934 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-config-data\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:28 crc kubenswrapper[4991]: I1206 01:25:28.927971 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-scripts\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.029679 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-config-data\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.029742 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-scripts\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.029810 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skchp\" (UniqueName: \"kubernetes.io/projected/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-kube-api-access-skchp\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.029835 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.037559 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-scripts\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.041670 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.043024 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-config-data\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.057964 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skchp\" (UniqueName: \"kubernetes.io/projected/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-kube-api-access-skchp\") pod \"nova-cell1-cell-mapping-wkngq\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.086441 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1e5c38-a110-4c12-947e-e729e0851bd3" path="/var/lib/kubelet/pods/4a1e5c38-a110-4c12-947e-e729e0851bd3/volumes" Dec 06 
01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.100789 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.478379 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:29 crc kubenswrapper[4991]: W1206 01:25:29.489246 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa8ab09_7dd4_4203_b8ba_6f6ff1cc2b86.slice/crio-fe9abf9d95c88ec4cdf9d9d2f08066f82e3a3ee5dc186ccf9e487bec061b068a WatchSource:0}: Error finding container fe9abf9d95c88ec4cdf9d9d2f08066f82e3a3ee5dc186ccf9e487bec061b068a: Status 404 returned error can't find the container with id fe9abf9d95c88ec4cdf9d9d2f08066f82e3a3ee5dc186ccf9e487bec061b068a Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.500930 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86","Type":"ContainerStarted","Data":"fe9abf9d95c88ec4cdf9d9d2f08066f82e3a3ee5dc186ccf9e487bec061b068a"} Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.503624 4991 generic.go:334] "Generic (PLEG): container finished" podID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerID="fa40c374476da9ee5a040231e357926cb1ca118048e25e63234d20bd5bef9570" exitCode=0 Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.503688 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerDied","Data":"fa40c374476da9ee5a040231e357926cb1ca118048e25e63234d20bd5bef9570"} Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.598132 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wkngq"] Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.618676 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665427 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-ceilometer-tls-certs\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665559 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-sg-core-conf-yaml\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665610 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-combined-ca-bundle\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665639 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-log-httpd\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665741 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-scripts\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665771 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-config-data\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665801 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-run-httpd\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.665915 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9csz\" (UniqueName: \"kubernetes.io/projected/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-kube-api-access-k9csz\") pod \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\" (UID: \"a923bb5c-27a6-4be3-8216-25d6a28d5f2a\") " Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.666140 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.669676 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.672771 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.673970 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-kube-api-access-k9csz" (OuterVolumeSpecName: "kube-api-access-k9csz") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "kube-api-access-k9csz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.676129 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-scripts" (OuterVolumeSpecName: "scripts") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.698812 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.751971 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.771417 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9csz\" (UniqueName: \"kubernetes.io/projected/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-kube-api-access-k9csz\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.771450 4991 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.771459 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.771468 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.771476 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.782846 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.822194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-config-data" (OuterVolumeSpecName: "config-data") pod "a923bb5c-27a6-4be3-8216-25d6a28d5f2a" (UID: "a923bb5c-27a6-4be3-8216-25d6a28d5f2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.873285 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:29 crc kubenswrapper[4991]: I1206 01:25:29.873322 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a923bb5c-27a6-4be3-8216-25d6a28d5f2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.518035 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a923bb5c-27a6-4be3-8216-25d6a28d5f2a","Type":"ContainerDied","Data":"b0150f6a055cdf40802a71cc912f38729bce07e56c2ed2f66a4482c2129849a2"} Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.518097 4991 scope.go:117] "RemoveContainer" containerID="ef80a7daee3ddf6976c39e699c3cb20a86ff155c30f8fb5a5cdb3e1ecd29f03c" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.518049 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.521338 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wkngq" event={"ID":"df43ecec-34dd-4ec1-aa1e-bdc70ff19359","Type":"ContainerStarted","Data":"94f8e0d6570e7f3ee8ea306d927da00a57a898b9302ebe496fc79e910650da3c"} Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.521375 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wkngq" event={"ID":"df43ecec-34dd-4ec1-aa1e-bdc70ff19359","Type":"ContainerStarted","Data":"a9315b01e5c323abecd40eecd4a7512c3acb3a3f47e2cb1d5f5bd14da3f5046b"} Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.525564 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86","Type":"ContainerStarted","Data":"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f"} Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.525631 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86","Type":"ContainerStarted","Data":"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a"} Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.543562 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wkngq" podStartSLOduration=2.543541335 podStartE2EDuration="2.543541335s" podCreationTimestamp="2025-12-06 01:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:30.539279286 +0000 UTC m=+1403.889569754" watchObservedRunningTime="2025-12-06 01:25:30.543541335 +0000 UTC m=+1403.893831803" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.545331 4991 scope.go:117] "RemoveContainer" 
containerID="b751924537a72c83133bb3b7375b98f27bdc0a7cef9c0a237269544da66816d6" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.573865 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.573840729 podStartE2EDuration="2.573840729s" podCreationTimestamp="2025-12-06 01:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:30.556917696 +0000 UTC m=+1403.907208184" watchObservedRunningTime="2025-12-06 01:25:30.573840729 +0000 UTC m=+1403.924131187" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.574753 4991 scope.go:117] "RemoveContainer" containerID="fa40c374476da9ee5a040231e357926cb1ca118048e25e63234d20bd5bef9570" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.594341 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.611391 4991 scope.go:117] "RemoveContainer" containerID="d1e785f4a696fb7c97e1c13eb2a812140ddbfec63424579f91f8516f292cd5c2" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.613463 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.637448 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:30 crc kubenswrapper[4991]: E1206 01:25:30.638028 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="sg-core" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638046 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="sg-core" Dec 06 01:25:30 crc kubenswrapper[4991]: E1206 01:25:30.638063 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" 
containerName="ceilometer-notification-agent" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638072 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-notification-agent" Dec 06 01:25:30 crc kubenswrapper[4991]: E1206 01:25:30.638096 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="proxy-httpd" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638105 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="proxy-httpd" Dec 06 01:25:30 crc kubenswrapper[4991]: E1206 01:25:30.638126 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-central-agent" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638134 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-central-agent" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638410 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="sg-core" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638442 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="proxy-httpd" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638459 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-central-agent" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.638469 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" containerName="ceilometer-notification-agent" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.640730 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.643800 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.644155 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.644539 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.666495 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.699278 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.699355 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwzf\" (UniqueName: \"kubernetes.io/projected/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-kube-api-access-xcwzf\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.699592 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.699695 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.699800 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-config-data\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.699954 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-log-httpd\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.700064 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-scripts\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.700116 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-run-httpd\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802104 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802191 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwzf\" (UniqueName: \"kubernetes.io/projected/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-kube-api-access-xcwzf\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802252 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802281 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802318 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-config-data\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802377 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-log-httpd\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc 
kubenswrapper[4991]: I1206 01:25:30.802413 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-scripts\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.802447 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-run-httpd\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.806033 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-run-httpd\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.807372 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-log-httpd\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.810789 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-scripts\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.811390 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-config-data\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " 
pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.812355 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.817000 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.818455 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwzf\" (UniqueName: \"kubernetes.io/projected/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-kube-api-access-xcwzf\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.824070 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e35617-6fcf-4b85-b5a4-d48d8a95f458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4e35617-6fcf-4b85-b5a4-d48d8a95f458\") " pod="openstack/ceilometer-0" Dec 06 01:25:30 crc kubenswrapper[4991]: I1206 01:25:30.968038 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 06 01:25:31 crc kubenswrapper[4991]: I1206 01:25:31.094904 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a923bb5c-27a6-4be3-8216-25d6a28d5f2a" path="/var/lib/kubelet/pods/a923bb5c-27a6-4be3-8216-25d6a28d5f2a/volumes" Dec 06 01:25:31 crc kubenswrapper[4991]: I1206 01:25:31.478480 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 06 01:25:31 crc kubenswrapper[4991]: I1206 01:25:31.540175 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4e35617-6fcf-4b85-b5a4-d48d8a95f458","Type":"ContainerStarted","Data":"83f8e8cc5b62b0a54c2d9d76bc1ecabe0854d51649209dd371cf3e6d4116a62b"} Dec 06 01:25:31 crc kubenswrapper[4991]: I1206 01:25:31.985112 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.143963 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-7chjv"] Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.144859 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerName="dnsmasq-dns" containerID="cri-o://f6c7eed506f4e8b6ba442e15c7c5bf52ecb32f2b70b0da1ed5c1aa3475a58e25" gracePeriod=10 Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.583175 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4e35617-6fcf-4b85-b5a4-d48d8a95f458","Type":"ContainerStarted","Data":"a6955ef7eb0799f0855ce48d87f56e9c03ecbb16c286f862456a62a4b85e9bdd"} Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.600912 4991 generic.go:334] "Generic (PLEG): container finished" podID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerID="f6c7eed506f4e8b6ba442e15c7c5bf52ecb32f2b70b0da1ed5c1aa3475a58e25" 
exitCode=0 Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.600970 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" event={"ID":"5151c0e3-7834-4643-8b66-5d7e0f421059","Type":"ContainerDied","Data":"f6c7eed506f4e8b6ba442e15c7c5bf52ecb32f2b70b0da1ed5c1aa3475a58e25"} Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.692392 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.749706 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-sb\") pod \"5151c0e3-7834-4643-8b66-5d7e0f421059\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.749822 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-config\") pod \"5151c0e3-7834-4643-8b66-5d7e0f421059\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.750051 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j55cj\" (UniqueName: \"kubernetes.io/projected/5151c0e3-7834-4643-8b66-5d7e0f421059-kube-api-access-j55cj\") pod \"5151c0e3-7834-4643-8b66-5d7e0f421059\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.750110 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-swift-storage-0\") pod \"5151c0e3-7834-4643-8b66-5d7e0f421059\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.750161 
4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-nb\") pod \"5151c0e3-7834-4643-8b66-5d7e0f421059\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.750197 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-svc\") pod \"5151c0e3-7834-4643-8b66-5d7e0f421059\" (UID: \"5151c0e3-7834-4643-8b66-5d7e0f421059\") " Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.774374 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5151c0e3-7834-4643-8b66-5d7e0f421059-kube-api-access-j55cj" (OuterVolumeSpecName: "kube-api-access-j55cj") pod "5151c0e3-7834-4643-8b66-5d7e0f421059" (UID: "5151c0e3-7834-4643-8b66-5d7e0f421059"). InnerVolumeSpecName "kube-api-access-j55cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.826350 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5151c0e3-7834-4643-8b66-5d7e0f421059" (UID: "5151c0e3-7834-4643-8b66-5d7e0f421059"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.836444 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5151c0e3-7834-4643-8b66-5d7e0f421059" (UID: "5151c0e3-7834-4643-8b66-5d7e0f421059"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.838946 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-config" (OuterVolumeSpecName: "config") pod "5151c0e3-7834-4643-8b66-5d7e0f421059" (UID: "5151c0e3-7834-4643-8b66-5d7e0f421059"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.843446 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5151c0e3-7834-4643-8b66-5d7e0f421059" (UID: "5151c0e3-7834-4643-8b66-5d7e0f421059"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.843774 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5151c0e3-7834-4643-8b66-5d7e0f421059" (UID: "5151c0e3-7834-4643-8b66-5d7e0f421059"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.852696 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.852722 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.852733 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j55cj\" (UniqueName: \"kubernetes.io/projected/5151c0e3-7834-4643-8b66-5d7e0f421059-kube-api-access-j55cj\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.852746 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.852755 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:32 crc kubenswrapper[4991]: I1206 01:25:32.852763 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5151c0e3-7834-4643-8b66-5d7e0f421059-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.618546 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" event={"ID":"5151c0e3-7834-4643-8b66-5d7e0f421059","Type":"ContainerDied","Data":"db10b62306dda0dbc50bc1b37602472f24d84e58314b2098adf3910d116f42c3"} Dec 06 01:25:33 crc 
kubenswrapper[4991]: I1206 01:25:33.619470 4991 scope.go:117] "RemoveContainer" containerID="f6c7eed506f4e8b6ba442e15c7c5bf52ecb32f2b70b0da1ed5c1aa3475a58e25" Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.618802 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-7chjv" Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.621946 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4e35617-6fcf-4b85-b5a4-d48d8a95f458","Type":"ContainerStarted","Data":"a6c21de6b8cdc35ed4c3fd4cf2f7bcff9c808bc8da3500901b132a19f25614a9"} Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.622006 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4e35617-6fcf-4b85-b5a4-d48d8a95f458","Type":"ContainerStarted","Data":"46b1b5ce97de87f6b8406e7e66ea43f1af570a1d0b606ae785bb7dd4887608d5"} Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.648123 4991 scope.go:117] "RemoveContainer" containerID="24d10f3b68efc3ee2a148dec96724d60f42d9179242e037284a48e65db97e90b" Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.657197 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-7chjv"] Dec 06 01:25:33 crc kubenswrapper[4991]: I1206 01:25:33.668610 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-7chjv"] Dec 06 01:25:35 crc kubenswrapper[4991]: I1206 01:25:35.061206 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" path="/var/lib/kubelet/pods/5151c0e3-7834-4643-8b66-5d7e0f421059/volumes" Dec 06 01:25:35 crc kubenswrapper[4991]: I1206 01:25:35.647884 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4e35617-6fcf-4b85-b5a4-d48d8a95f458","Type":"ContainerStarted","Data":"e4ac8282525f5589c9b1fba86e841c1ee94fc9273c7b2e910db64d75699a163a"} Dec 06 01:25:35 crc kubenswrapper[4991]: I1206 01:25:35.648830 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 06 01:25:35 crc kubenswrapper[4991]: I1206 01:25:35.650150 4991 generic.go:334] "Generic (PLEG): container finished" podID="df43ecec-34dd-4ec1-aa1e-bdc70ff19359" containerID="94f8e0d6570e7f3ee8ea306d927da00a57a898b9302ebe496fc79e910650da3c" exitCode=0 Dec 06 01:25:35 crc kubenswrapper[4991]: I1206 01:25:35.650203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wkngq" event={"ID":"df43ecec-34dd-4ec1-aa1e-bdc70ff19359","Type":"ContainerDied","Data":"94f8e0d6570e7f3ee8ea306d927da00a57a898b9302ebe496fc79e910650da3c"} Dec 06 01:25:35 crc kubenswrapper[4991]: I1206 01:25:35.689520 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.365310683 podStartE2EDuration="5.689454149s" podCreationTimestamp="2025-12-06 01:25:30 +0000 UTC" firstStartedPulling="2025-12-06 01:25:31.488035658 +0000 UTC m=+1404.838326136" lastFinishedPulling="2025-12-06 01:25:34.812179094 +0000 UTC m=+1408.162469602" observedRunningTime="2025-12-06 01:25:35.678465419 +0000 UTC m=+1409.028755927" watchObservedRunningTime="2025-12-06 01:25:35.689454149 +0000 UTC m=+1409.039744657" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.060422 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.148320 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-config-data\") pod \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.148524 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skchp\" (UniqueName: \"kubernetes.io/projected/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-kube-api-access-skchp\") pod \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.148579 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-scripts\") pod \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.148739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-combined-ca-bundle\") pod \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\" (UID: \"df43ecec-34dd-4ec1-aa1e-bdc70ff19359\") " Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.154579 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-scripts" (OuterVolumeSpecName: "scripts") pod "df43ecec-34dd-4ec1-aa1e-bdc70ff19359" (UID: "df43ecec-34dd-4ec1-aa1e-bdc70ff19359"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.170628 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-kube-api-access-skchp" (OuterVolumeSpecName: "kube-api-access-skchp") pod "df43ecec-34dd-4ec1-aa1e-bdc70ff19359" (UID: "df43ecec-34dd-4ec1-aa1e-bdc70ff19359"). InnerVolumeSpecName "kube-api-access-skchp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.179050 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df43ecec-34dd-4ec1-aa1e-bdc70ff19359" (UID: "df43ecec-34dd-4ec1-aa1e-bdc70ff19359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.180025 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-config-data" (OuterVolumeSpecName: "config-data") pod "df43ecec-34dd-4ec1-aa1e-bdc70ff19359" (UID: "df43ecec-34dd-4ec1-aa1e-bdc70ff19359"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.251185 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skchp\" (UniqueName: \"kubernetes.io/projected/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-kube-api-access-skchp\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.251221 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.251231 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.251258 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df43ecec-34dd-4ec1-aa1e-bdc70ff19359-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.678370 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wkngq" event={"ID":"df43ecec-34dd-4ec1-aa1e-bdc70ff19359","Type":"ContainerDied","Data":"a9315b01e5c323abecd40eecd4a7512c3acb3a3f47e2cb1d5f5bd14da3f5046b"} Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.678414 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9315b01e5c323abecd40eecd4a7512c3acb3a3f47e2cb1d5f5bd14da3f5046b" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.678467 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wkngq" Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.950251 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.950457 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-log" containerID="cri-o://b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a" gracePeriod=30 Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.950632 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-api" containerID="cri-o://3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f" gracePeriod=30 Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.962299 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.962502 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fed182dc-f3b0-470b-965c-66f5cbb719b5" containerName="nova-scheduler-scheduler" containerID="cri-o://cb371c896f879db4159a31a00a882866b3c26a725fcc5d68b39c01108d773cdf" gracePeriod=30 Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.978872 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.979105 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-log" containerID="cri-o://81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e" gracePeriod=30 Dec 06 01:25:37 crc kubenswrapper[4991]: I1206 01:25:37.980180 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-metadata" containerID="cri-o://d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6" gracePeriod=30 Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.535377 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.576416 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-combined-ca-bundle\") pod \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.576542 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-public-tls-certs\") pod \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.576566 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-config-data\") pod \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.576742 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-internal-tls-certs\") pod \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.576824 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4glvp\" (UniqueName: \"kubernetes.io/projected/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-kube-api-access-4glvp\") pod \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.576868 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-logs\") pod \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\" (UID: \"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.577688 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-logs" (OuterVolumeSpecName: "logs") pod "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" (UID: "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.585449 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-kube-api-access-4glvp" (OuterVolumeSpecName: "kube-api-access-4glvp") pod "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" (UID: "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86"). InnerVolumeSpecName "kube-api-access-4glvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.612591 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-config-data" (OuterVolumeSpecName: "config-data") pod "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" (UID: "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.615971 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" (UID: "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.641212 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" (UID: "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.683493 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4glvp\" (UniqueName: \"kubernetes.io/projected/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-kube-api-access-4glvp\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.683689 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.683757 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.683825 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-public-tls-certs\") on node \"crc\" 
DevicePath \"\"" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.683883 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.685656 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" (UID: "1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.695909 4991 generic.go:334] "Generic (PLEG): container finished" podID="fed182dc-f3b0-470b-965c-66f5cbb719b5" containerID="cb371c896f879db4159a31a00a882866b3c26a725fcc5d68b39c01108d773cdf" exitCode=0 Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.696295 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fed182dc-f3b0-470b-965c-66f5cbb719b5","Type":"ContainerDied","Data":"cb371c896f879db4159a31a00a882866b3c26a725fcc5d68b39c01108d773cdf"} Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.697977 4991 generic.go:334] "Generic (PLEG): container finished" podID="8603a428-6117-4bbb-a409-45aa3c447611" containerID="81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e" exitCode=143 Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.698052 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8603a428-6117-4bbb-a409-45aa3c447611","Type":"ContainerDied","Data":"81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e"} Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699369 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerID="3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f" exitCode=0 Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699389 4991 generic.go:334] "Generic (PLEG): container finished" podID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerID="b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a" exitCode=143 Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699423 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86","Type":"ContainerDied","Data":"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f"} Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699439 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86","Type":"ContainerDied","Data":"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a"} Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699450 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86","Type":"ContainerDied","Data":"fe9abf9d95c88ec4cdf9d9d2f08066f82e3a3ee5dc186ccf9e487bec061b068a"} Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699466 4991 scope.go:117] "RemoveContainer" containerID="3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.699962 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.733434 4991 scope.go:117] "RemoveContainer" containerID="b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.744229 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.786522 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.796360 4991 scope.go:117] "RemoveContainer" containerID="3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.796530 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.797716 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f\": container with ID starting with 3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f not found: ID does not exist" containerID="3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.797761 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f"} err="failed to get container status \"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f\": rpc error: code = NotFound desc = could not find container \"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f\": container with ID starting with 
3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f not found: ID does not exist" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.797784 4991 scope.go:117] "RemoveContainer" containerID="b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a" Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.798399 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a\": container with ID starting with b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a not found: ID does not exist" containerID="b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.798462 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a"} err="failed to get container status \"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a\": rpc error: code = NotFound desc = could not find container \"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a\": container with ID starting with b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a not found: ID does not exist" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.798493 4991 scope.go:117] "RemoveContainer" containerID="3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.798929 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f"} err="failed to get container status \"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f\": rpc error: code = NotFound desc = could not find container \"3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f\": container with ID 
starting with 3a1316092873fdbbe77bb596ed4a261fbb31c16f02099a31aa92adb84d85620f not found: ID does not exist" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.798954 4991 scope.go:117] "RemoveContainer" containerID="b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.799220 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a"} err="failed to get container status \"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a\": rpc error: code = NotFound desc = could not find container \"b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a\": container with ID starting with b6dceebaa75ae36e7899a4fb40f787688cf1bd50e9ab492a1b0da0dab00b534a not found: ID does not exist" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.810673 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.817396 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-api" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.817583 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-api" Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.817670 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerName="init" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.817743 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerName="init" Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.817836 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-log" Dec 06 
01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.817892 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-log" Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.817963 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerName="dnsmasq-dns" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.818039 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerName="dnsmasq-dns" Dec 06 01:25:38 crc kubenswrapper[4991]: E1206 01:25:38.818102 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df43ecec-34dd-4ec1-aa1e-bdc70ff19359" containerName="nova-manage" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.818152 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="df43ecec-34dd-4ec1-aa1e-bdc70ff19359" containerName="nova-manage" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.818467 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-api" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.818540 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="df43ecec-34dd-4ec1-aa1e-bdc70ff19359" containerName="nova-manage" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.818621 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" containerName="nova-api-log" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.818689 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5151c0e3-7834-4643-8b66-5d7e0f421059" containerName="dnsmasq-dns" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.819716 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.824267 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.824380 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.826343 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.835663 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.889703 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-public-tls-certs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.889827 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.889855 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.889914 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j4r6t\" (UniqueName: \"kubernetes.io/projected/d718f384-f95f-4601-afc7-0aa3655b9855-kube-api-access-j4r6t\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.890310 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-config-data\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.890586 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d718f384-f95f-4601-afc7-0aa3655b9855-logs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.894713 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.991992 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-config-data\") pod \"fed182dc-f3b0-470b-965c-66f5cbb719b5\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992057 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-combined-ca-bundle\") pod \"fed182dc-f3b0-470b-965c-66f5cbb719b5\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992124 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vznf\" (UniqueName: \"kubernetes.io/projected/fed182dc-f3b0-470b-965c-66f5cbb719b5-kube-api-access-6vznf\") pod \"fed182dc-f3b0-470b-965c-66f5cbb719b5\" (UID: \"fed182dc-f3b0-470b-965c-66f5cbb719b5\") " Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992288 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-public-tls-certs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992351 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992387 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4r6t\" (UniqueName: \"kubernetes.io/projected/d718f384-f95f-4601-afc7-0aa3655b9855-kube-api-access-j4r6t\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992449 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-config-data\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.992496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d718f384-f95f-4601-afc7-0aa3655b9855-logs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.993091 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d718f384-f95f-4601-afc7-0aa3655b9855-logs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.995629 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed182dc-f3b0-470b-965c-66f5cbb719b5-kube-api-access-6vznf" (OuterVolumeSpecName: "kube-api-access-6vznf") pod "fed182dc-f3b0-470b-965c-66f5cbb719b5" (UID: "fed182dc-f3b0-470b-965c-66f5cbb719b5"). InnerVolumeSpecName "kube-api-access-6vznf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.997047 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.997296 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-config-data\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:38 crc kubenswrapper[4991]: I1206 01:25:38.998125 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-public-tls-certs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.001339 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d718f384-f95f-4601-afc7-0aa3655b9855-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.009189 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4r6t\" (UniqueName: \"kubernetes.io/projected/d718f384-f95f-4601-afc7-0aa3655b9855-kube-api-access-j4r6t\") pod \"nova-api-0\" (UID: \"d718f384-f95f-4601-afc7-0aa3655b9855\") " pod="openstack/nova-api-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.017053 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fed182dc-f3b0-470b-965c-66f5cbb719b5" (UID: "fed182dc-f3b0-470b-965c-66f5cbb719b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.031081 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-config-data" (OuterVolumeSpecName: "config-data") pod "fed182dc-f3b0-470b-965c-66f5cbb719b5" (UID: "fed182dc-f3b0-470b-965c-66f5cbb719b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.056441 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86" path="/var/lib/kubelet/pods/1fa8ab09-7dd4-4203-b8ba-6f6ff1cc2b86/volumes" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.093615 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.094159 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed182dc-f3b0-470b-965c-66f5cbb719b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.094258 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vznf\" (UniqueName: \"kubernetes.io/projected/fed182dc-f3b0-470b-965c-66f5cbb719b5-kube-api-access-6vznf\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.148409 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.633010 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.717118 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d718f384-f95f-4601-afc7-0aa3655b9855","Type":"ContainerStarted","Data":"c08e00edda6b6436bd007a14026c19affc838ed61f10053e87e552d9065b04de"} Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.720864 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fed182dc-f3b0-470b-965c-66f5cbb719b5","Type":"ContainerDied","Data":"1174ef697b8a04e670d4936f7f793b26c0262201d86546e24f5472de7e39b43a"} Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.721117 4991 scope.go:117] "RemoveContainer" containerID="cb371c896f879db4159a31a00a882866b3c26a725fcc5d68b39c01108d773cdf" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.721281 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.793366 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.813925 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.822666 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:25:39 crc kubenswrapper[4991]: E1206 01:25:39.823356 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed182dc-f3b0-470b-965c-66f5cbb719b5" containerName="nova-scheduler-scheduler" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.823375 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed182dc-f3b0-470b-965c-66f5cbb719b5" containerName="nova-scheduler-scheduler" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.823902 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed182dc-f3b0-470b-965c-66f5cbb719b5" containerName="nova-scheduler-scheduler" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.825124 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.829927 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.831741 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.930377 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1581dfb-5721-4ba3-9364-4b891a476821-config-data\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.932309 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vnh\" (UniqueName: \"kubernetes.io/projected/a1581dfb-5721-4ba3-9364-4b891a476821-kube-api-access-p6vnh\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:39 crc kubenswrapper[4991]: I1206 01:25:39.932624 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1581dfb-5721-4ba3-9364-4b891a476821-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.034556 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vnh\" (UniqueName: \"kubernetes.io/projected/a1581dfb-5721-4ba3-9364-4b891a476821-kube-api-access-p6vnh\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.034660 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1581dfb-5721-4ba3-9364-4b891a476821-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.034690 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1581dfb-5721-4ba3-9364-4b891a476821-config-data\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.041645 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1581dfb-5721-4ba3-9364-4b891a476821-config-data\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.042822 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1581dfb-5721-4ba3-9364-4b891a476821-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.058014 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vnh\" (UniqueName: \"kubernetes.io/projected/a1581dfb-5721-4ba3-9364-4b891a476821-kube-api-access-p6vnh\") pod \"nova-scheduler-0\" (UID: \"a1581dfb-5721-4ba3-9364-4b891a476821\") " pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.153104 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.645967 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.734026 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a1581dfb-5721-4ba3-9364-4b891a476821","Type":"ContainerStarted","Data":"f10379052cc798597942b471a1c90e6c736db2c8691f3c150e67951016b44c77"} Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.737754 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d718f384-f95f-4601-afc7-0aa3655b9855","Type":"ContainerStarted","Data":"956b23b1c055d58b85bd99bc6ae54c7f150f8576df058c49daacc1b672b10b6a"} Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.737814 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d718f384-f95f-4601-afc7-0aa3655b9855","Type":"ContainerStarted","Data":"2a71304548a4695110b233549fb27aae9103a9c12e89dac8e7278d81e5fa7861"} Dec 06 01:25:40 crc kubenswrapper[4991]: I1206 01:25:40.778114 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.77808545 podStartE2EDuration="2.77808545s" podCreationTimestamp="2025-12-06 01:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:40.76792525 +0000 UTC m=+1414.118215738" watchObservedRunningTime="2025-12-06 01:25:40.77808545 +0000 UTC m=+1414.128375918" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.060304 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed182dc-f3b0-470b-965c-66f5cbb719b5" path="/var/lib/kubelet/pods/fed182dc-f3b0-470b-965c-66f5cbb719b5/volumes" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.164299 4991 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:48934->10.217.0.193:8775: read: connection reset by peer" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.164855 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:48928->10.217.0.193:8775: read: connection reset by peer" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.644691 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.674648 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-combined-ca-bundle\") pod \"8603a428-6117-4bbb-a409-45aa3c447611\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.674851 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8603a428-6117-4bbb-a409-45aa3c447611-logs\") pod \"8603a428-6117-4bbb-a409-45aa3c447611\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.674879 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-nova-metadata-tls-certs\") pod \"8603a428-6117-4bbb-a409-45aa3c447611\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.674936 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-config-data\") pod \"8603a428-6117-4bbb-a409-45aa3c447611\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.675207 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58dh8\" (UniqueName: \"kubernetes.io/projected/8603a428-6117-4bbb-a409-45aa3c447611-kube-api-access-58dh8\") pod \"8603a428-6117-4bbb-a409-45aa3c447611\" (UID: \"8603a428-6117-4bbb-a409-45aa3c447611\") " Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.676719 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8603a428-6117-4bbb-a409-45aa3c447611-logs" (OuterVolumeSpecName: "logs") pod "8603a428-6117-4bbb-a409-45aa3c447611" (UID: "8603a428-6117-4bbb-a409-45aa3c447611"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.690229 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8603a428-6117-4bbb-a409-45aa3c447611-kube-api-access-58dh8" (OuterVolumeSpecName: "kube-api-access-58dh8") pod "8603a428-6117-4bbb-a409-45aa3c447611" (UID: "8603a428-6117-4bbb-a409-45aa3c447611"). InnerVolumeSpecName "kube-api-access-58dh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.723160 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8603a428-6117-4bbb-a409-45aa3c447611" (UID: "8603a428-6117-4bbb-a409-45aa3c447611"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.761294 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-config-data" (OuterVolumeSpecName: "config-data") pod "8603a428-6117-4bbb-a409-45aa3c447611" (UID: "8603a428-6117-4bbb-a409-45aa3c447611"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.765749 4991 generic.go:334] "Generic (PLEG): container finished" podID="8603a428-6117-4bbb-a409-45aa3c447611" containerID="d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6" exitCode=0 Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.765839 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8603a428-6117-4bbb-a409-45aa3c447611","Type":"ContainerDied","Data":"d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6"} Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.765874 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8603a428-6117-4bbb-a409-45aa3c447611","Type":"ContainerDied","Data":"3e250d89491720d399fa8b173bc199de94980e0111df56e9fc6f0cfaa8748078"} Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.765895 4991 scope.go:117] "RemoveContainer" containerID="d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.766017 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.771716 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a1581dfb-5721-4ba3-9364-4b891a476821","Type":"ContainerStarted","Data":"34d2efebd69950e940930d83c15492f6add3f5ca9967b537c6c26037d1b5dddd"} Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.776082 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8603a428-6117-4bbb-a409-45aa3c447611" (UID: "8603a428-6117-4bbb-a409-45aa3c447611"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.782463 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58dh8\" (UniqueName: \"kubernetes.io/projected/8603a428-6117-4bbb-a409-45aa3c447611-kube-api-access-58dh8\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.782515 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.782540 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8603a428-6117-4bbb-a409-45aa3c447611-logs\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.782552 4991 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.782564 4991 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8603a428-6117-4bbb-a409-45aa3c447611-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.795965 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.795934607 podStartE2EDuration="2.795934607s" podCreationTimestamp="2025-12-06 01:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:41.795217639 +0000 UTC m=+1415.145508137" watchObservedRunningTime="2025-12-06 01:25:41.795934607 +0000 UTC m=+1415.146225085" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.806583 4991 scope.go:117] "RemoveContainer" containerID="81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.833418 4991 scope.go:117] "RemoveContainer" containerID="d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6" Dec 06 01:25:41 crc kubenswrapper[4991]: E1206 01:25:41.834367 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6\": container with ID starting with d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6 not found: ID does not exist" containerID="d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.834436 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6"} err="failed to get container status \"d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6\": rpc error: code = NotFound desc = could not find container 
\"d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6\": container with ID starting with d60d56a67c1a309283a3375423282c8e12279d7ec963fe832f420ee19d98c9f6 not found: ID does not exist" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.834459 4991 scope.go:117] "RemoveContainer" containerID="81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e" Dec 06 01:25:41 crc kubenswrapper[4991]: E1206 01:25:41.835071 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e\": container with ID starting with 81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e not found: ID does not exist" containerID="81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e" Dec 06 01:25:41 crc kubenswrapper[4991]: I1206 01:25:41.835107 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e"} err="failed to get container status \"81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e\": rpc error: code = NotFound desc = could not find container \"81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e\": container with ID starting with 81dfb5859167b882b8078f12c83503aca3b6cb983c82f38367a049a74243a90e not found: ID does not exist" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.147156 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.167714 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.188399 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:25:42 crc kubenswrapper[4991]: E1206 01:25:42.189473 4991 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-log" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.189492 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-log" Dec 06 01:25:42 crc kubenswrapper[4991]: E1206 01:25:42.189530 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-metadata" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.189538 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-metadata" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.189760 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-metadata" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.189784 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8603a428-6117-4bbb-a409-45aa3c447611" containerName="nova-metadata-log" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.192836 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.197349 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.197944 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.198202 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.296748 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.296866 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-config-data\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.296935 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.296965 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktd29\" (UniqueName: 
\"kubernetes.io/projected/bae01666-bf5e-4dc0-a540-440d511229e7-kube-api-access-ktd29\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.297055 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae01666-bf5e-4dc0-a540-440d511229e7-logs\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.399048 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.399105 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-config-data\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.399133 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.399152 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktd29\" (UniqueName: \"kubernetes.io/projected/bae01666-bf5e-4dc0-a540-440d511229e7-kube-api-access-ktd29\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") 
" pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.399194 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae01666-bf5e-4dc0-a540-440d511229e7-logs\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.399768 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae01666-bf5e-4dc0-a540-440d511229e7-logs\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.405359 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-config-data\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.405574 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.410637 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae01666-bf5e-4dc0-a540-440d511229e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.446713 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktd29\" (UniqueName: 
\"kubernetes.io/projected/bae01666-bf5e-4dc0-a540-440d511229e7-kube-api-access-ktd29\") pod \"nova-metadata-0\" (UID: \"bae01666-bf5e-4dc0-a540-440d511229e7\") " pod="openstack/nova-metadata-0" Dec 06 01:25:42 crc kubenswrapper[4991]: I1206 01:25:42.524950 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 06 01:25:43 crc kubenswrapper[4991]: I1206 01:25:43.052803 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8603a428-6117-4bbb-a409-45aa3c447611" path="/var/lib/kubelet/pods/8603a428-6117-4bbb-a409-45aa3c447611/volumes" Dec 06 01:25:43 crc kubenswrapper[4991]: I1206 01:25:43.054180 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 06 01:25:43 crc kubenswrapper[4991]: I1206 01:25:43.799652 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bae01666-bf5e-4dc0-a540-440d511229e7","Type":"ContainerStarted","Data":"c7f629f399686b404b72f7abd3ba246e1435e956a7a1fa7878b990d014231ac1"} Dec 06 01:25:43 crc kubenswrapper[4991]: I1206 01:25:43.800257 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bae01666-bf5e-4dc0-a540-440d511229e7","Type":"ContainerStarted","Data":"e1626e4f505d309340a54894add67f547137b8dbe7b27bed23d012715eb088c2"} Dec 06 01:25:43 crc kubenswrapper[4991]: I1206 01:25:43.800274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bae01666-bf5e-4dc0-a540-440d511229e7","Type":"ContainerStarted","Data":"b7842d360b80d1475650121e92f2afd9934dd27fe09eba23f7dd549c28cd6a40"} Dec 06 01:25:43 crc kubenswrapper[4991]: I1206 01:25:43.836733 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.836714061 podStartE2EDuration="1.836714061s" podCreationTimestamp="2025-12-06 01:25:42 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:25:43.823769861 +0000 UTC m=+1417.174060409" watchObservedRunningTime="2025-12-06 01:25:43.836714061 +0000 UTC m=+1417.187004529" Dec 06 01:25:45 crc kubenswrapper[4991]: I1206 01:25:45.153957 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 06 01:25:46 crc kubenswrapper[4991]: I1206 01:25:46.740163 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:25:46 crc kubenswrapper[4991]: I1206 01:25:46.741141 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:25:47 crc kubenswrapper[4991]: I1206 01:25:47.526193 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 01:25:47 crc kubenswrapper[4991]: I1206 01:25:47.526760 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 06 01:25:49 crc kubenswrapper[4991]: I1206 01:25:49.148764 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 01:25:49 crc kubenswrapper[4991]: I1206 01:25:49.149954 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 06 01:25:50 crc kubenswrapper[4991]: I1206 01:25:50.153386 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 06 01:25:50 
crc kubenswrapper[4991]: I1206 01:25:50.167243 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d718f384-f95f-4601-afc7-0aa3655b9855" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:50 crc kubenswrapper[4991]: I1206 01:25:50.168749 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d718f384-f95f-4601-afc7-0aa3655b9855" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:50 crc kubenswrapper[4991]: I1206 01:25:50.203432 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 06 01:25:50 crc kubenswrapper[4991]: I1206 01:25:50.964491 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 06 01:25:52 crc kubenswrapper[4991]: I1206 01:25:52.525974 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 01:25:52 crc kubenswrapper[4991]: I1206 01:25:52.526309 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 06 01:25:53 crc kubenswrapper[4991]: I1206 01:25:53.544237 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bae01666-bf5e-4dc0-a540-440d511229e7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:53 crc kubenswrapper[4991]: I1206 01:25:53.544310 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bae01666-bf5e-4dc0-a540-440d511229e7" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 06 01:25:59 crc kubenswrapper[4991]: I1206 01:25:59.159180 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 01:25:59 crc kubenswrapper[4991]: I1206 01:25:59.160453 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 01:25:59 crc kubenswrapper[4991]: I1206 01:25:59.170505 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 06 01:25:59 crc kubenswrapper[4991]: I1206 01:25:59.170690 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 01:26:00 crc kubenswrapper[4991]: I1206 01:26:00.019038 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 06 01:26:00 crc kubenswrapper[4991]: I1206 01:26:00.036504 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 06 01:26:00 crc kubenswrapper[4991]: I1206 01:26:00.985551 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 06 01:26:02 crc kubenswrapper[4991]: I1206 01:26:02.532214 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 01:26:02 crc kubenswrapper[4991]: I1206 01:26:02.534108 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 06 01:26:02 crc kubenswrapper[4991]: I1206 01:26:02.539568 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 01:26:03 crc kubenswrapper[4991]: I1206 01:26:03.101842 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 06 01:26:10 crc kubenswrapper[4991]: 
I1206 01:26:10.867929 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:26:11 crc kubenswrapper[4991]: I1206 01:26:11.904533 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:26:15 crc kubenswrapper[4991]: I1206 01:26:15.750504 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="rabbitmq" containerID="cri-o://b9efb04596d3956458c1741004cf0c3c747574ec679f10d4a938952ad80a97e6" gracePeriod=604796 Dec 06 01:26:16 crc kubenswrapper[4991]: I1206 01:26:16.237917 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="rabbitmq" containerID="cri-o://485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54" gracePeriod=604796 Dec 06 01:26:16 crc kubenswrapper[4991]: I1206 01:26:16.740289 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:26:16 crc kubenswrapper[4991]: I1206 01:26:16.740378 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:26:18 crc kubenswrapper[4991]: I1206 01:26:18.021120 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="rabbitmq" probeResult="failure" output="dial tcp 
10.217.0.98:5671: connect: connection refused" Dec 06 01:26:18 crc kubenswrapper[4991]: I1206 01:26:18.347261 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 06 01:26:19 crc kubenswrapper[4991]: I1206 01:26:19.867232 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwwhl"] Dec 06 01:26:19 crc kubenswrapper[4991]: I1206 01:26:19.870791 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:19 crc kubenswrapper[4991]: I1206 01:26:19.878250 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwwhl"] Dec 06 01:26:19 crc kubenswrapper[4991]: I1206 01:26:19.922178 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-utilities\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:19 crc kubenswrapper[4991]: I1206 01:26:19.922276 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-catalog-content\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:19 crc kubenswrapper[4991]: I1206 01:26:19.922308 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s96j\" (UniqueName: \"kubernetes.io/projected/eb527461-35b2-4636-9ca8-803adf47668f-kube-api-access-7s96j\") pod 
\"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.023921 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-utilities\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.024047 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-catalog-content\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.024086 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s96j\" (UniqueName: \"kubernetes.io/projected/eb527461-35b2-4636-9ca8-803adf47668f-kube-api-access-7s96j\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.025031 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-utilities\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.025292 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-catalog-content\") pod \"redhat-operators-vwwhl\" (UID: 
\"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.059793 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s96j\" (UniqueName: \"kubernetes.io/projected/eb527461-35b2-4636-9ca8-803adf47668f-kube-api-access-7s96j\") pod \"redhat-operators-vwwhl\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.195768 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:20 crc kubenswrapper[4991]: I1206 01:26:20.697473 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwwhl"] Dec 06 01:26:21 crc kubenswrapper[4991]: I1206 01:26:21.297499 4991 generic.go:334] "Generic (PLEG): container finished" podID="eb527461-35b2-4636-9ca8-803adf47668f" containerID="c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067" exitCode=0 Dec 06 01:26:21 crc kubenswrapper[4991]: I1206 01:26:21.297617 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerDied","Data":"c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067"} Dec 06 01:26:21 crc kubenswrapper[4991]: I1206 01:26:21.297841 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerStarted","Data":"a38be09ecd005f28c4e663126246434639683d49cc498dfb5ddca4cf4fc4ed2f"} Dec 06 01:26:21 crc kubenswrapper[4991]: I1206 01:26:21.299892 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.309426 4991 generic.go:334] 
"Generic (PLEG): container finished" podID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerID="b9efb04596d3956458c1741004cf0c3c747574ec679f10d4a938952ad80a97e6" exitCode=0 Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.310030 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af","Type":"ContainerDied","Data":"b9efb04596d3956458c1741004cf0c3c747574ec679f10d4a938952ad80a97e6"} Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.310074 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af","Type":"ContainerDied","Data":"83246f8482c1f84400f6204ad561026e90de77e8cc0c9cdc11925e1cb185a0df"} Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.310094 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83246f8482c1f84400f6204ad561026e90de77e8cc0c9cdc11925e1cb185a0df" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.313319 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerStarted","Data":"17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd"} Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.354620 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468008 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-tls\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468064 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-erlang-cookie\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468181 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-plugins\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468202 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cs7\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-kube-api-access-68cs7\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468219 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-confd\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468244 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468275 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-config-data\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468318 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-erlang-cookie-secret\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468333 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-pod-info\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468372 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-plugins-conf\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.468435 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-server-conf\") pod \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\" (UID: \"d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.472648 
4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.479886 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.482077 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.486419 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.488659 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-kube-api-access-68cs7" (OuterVolumeSpecName: "kube-api-access-68cs7") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "kube-api-access-68cs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.495478 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.500878 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.501015 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-pod-info" (OuterVolumeSpecName: "pod-info") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.545746 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-config-data" (OuterVolumeSpecName: "config-data") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570777 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570809 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cs7\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-kube-api-access-68cs7\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570842 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570852 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570860 4991 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570869 4991 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570876 4991 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570884 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.570896 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.595809 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.598305 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-server-conf" (OuterVolumeSpecName: "server-conf") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.632703 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" (UID: "d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.680778 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.680813 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.680823 4991 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.811346 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884529 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-erlang-cookie\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884596 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-plugins-conf\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884626 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-plugins\") 
pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884693 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-tls\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884780 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-confd\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884824 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dn5h\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-kube-api-access-5dn5h\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884863 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-config-data\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.884933 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8380dcf2-30be-4578-937b-5e2a8f4fc074-erlang-cookie-secret\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.885038 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.885092 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8380dcf2-30be-4578-937b-5e2a8f4fc074-pod-info\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.885130 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-server-conf\") pod \"8380dcf2-30be-4578-937b-5e2a8f4fc074\" (UID: \"8380dcf2-30be-4578-937b-5e2a8f4fc074\") " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.885745 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.886145 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.891238 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.896142 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.896940 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.896958 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.897296 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.897740 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.897758 4991 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.900137 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8380dcf2-30be-4578-937b-5e2a8f4fc074-pod-info" (OuterVolumeSpecName: "pod-info") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.903339 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-kube-api-access-5dn5h" (OuterVolumeSpecName: "kube-api-access-5dn5h") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "kube-api-access-5dn5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.906865 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8380dcf2-30be-4578-937b-5e2a8f4fc074-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.940005 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.987521 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-config-data" (OuterVolumeSpecName: "config-data") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999357 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-server-conf" (OuterVolumeSpecName: "server-conf") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999830 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dn5h\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-kube-api-access-5dn5h\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999857 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999872 4991 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8380dcf2-30be-4578-937b-5e2a8f4fc074-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999886 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999898 4991 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8380dcf2-30be-4578-937b-5e2a8f4fc074-pod-info\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999909 4991 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8380dcf2-30be-4578-937b-5e2a8f4fc074-server-conf\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:22 crc kubenswrapper[4991]: I1206 01:26:22.999919 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.052325 4991 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8380dcf2-30be-4578-937b-5e2a8f4fc074" (UID: "8380dcf2-30be-4578-937b-5e2a8f4fc074"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.101513 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8380dcf2-30be-4578-937b-5e2a8f4fc074-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.325431 4991 generic.go:334] "Generic (PLEG): container finished" podID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerID="485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54" exitCode=0 Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.325730 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8380dcf2-30be-4578-937b-5e2a8f4fc074","Type":"ContainerDied","Data":"485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54"} Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.325781 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8380dcf2-30be-4578-937b-5e2a8f4fc074","Type":"ContainerDied","Data":"bdae74660fde60c7bce1f8fd00b55f9fd77c7d915f167364a5bb557731ed285a"} Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.325802 4991 scope.go:117] "RemoveContainer" containerID="485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.325974 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.327056 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.382197 4991 scope.go:117] "RemoveContainer" containerID="906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.383673 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.403439 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.416184 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.429157 4991 scope.go:117] "RemoveContainer" containerID="485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54" Dec 06 01:26:23 crc kubenswrapper[4991]: E1206 01:26:23.429837 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54\": container with ID starting with 485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54 not found: ID does not exist" containerID="485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.429882 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54"} err="failed to get container status \"485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54\": rpc error: code = NotFound desc = could not find container \"485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54\": container with ID starting with 485ffea6f5a453e06d197597f0b9df4471c81a596a195cf59876927841636f54 not found: ID does not exist" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 
01:26:23.429911 4991 scope.go:117] "RemoveContainer" containerID="906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a" Dec 06 01:26:23 crc kubenswrapper[4991]: E1206 01:26:23.432161 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a\": container with ID starting with 906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a not found: ID does not exist" containerID="906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.432196 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a"} err="failed to get container status \"906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a\": rpc error: code = NotFound desc = could not find container \"906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a\": container with ID starting with 906bedf92a02c420aa1b9648b0ca335743a0784def2ed1b54f18aedd5e785b8a not found: ID does not exist" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.434556 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460052 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: E1206 01:26:23.460572 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="rabbitmq" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460591 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="rabbitmq" Dec 06 01:26:23 crc kubenswrapper[4991]: E1206 01:26:23.460623 4991 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="setup-container" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460630 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="setup-container" Dec 06 01:26:23 crc kubenswrapper[4991]: E1206 01:26:23.460642 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="rabbitmq" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460652 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="rabbitmq" Dec 06 01:26:23 crc kubenswrapper[4991]: E1206 01:26:23.460665 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="setup-container" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460672 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="setup-container" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460853 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" containerName="rabbitmq" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.460879 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" containerName="rabbitmq" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.462025 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.467931 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.467970 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.468246 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.468339 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dkcz5" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.468369 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.468551 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.468822 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.475100 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.497512 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.499329 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.503772 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.504098 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-259kt" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.504195 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509267 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e3e3138-a6f6-4005-b57d-ee02b9534e57-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509333 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509365 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509404 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509442 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509473 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509508 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509524 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2s95\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-kube-api-access-v2s95\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509542 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509579 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e3e3138-a6f6-4005-b57d-ee02b9534e57-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.509623 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.514360 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.514413 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.514582 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.514785 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.517048 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.610965 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611045 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-config-data\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611078 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611095 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611115 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2s95\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-kube-api-access-v2s95\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611137 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611161 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611259 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611344 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47vkx\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-kube-api-access-47vkx\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611376 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e3e3138-a6f6-4005-b57d-ee02b9534e57-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611396 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611422 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611493 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611530 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611562 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611769 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e3e3138-a6f6-4005-b57d-ee02b9534e57-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 
crc kubenswrapper[4991]: I1206 01:26:23.611810 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611834 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611860 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611918 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.611966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612020 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612040 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612244 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612296 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612590 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612809 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.612829 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e3e3138-a6f6-4005-b57d-ee02b9534e57-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.616326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e3e3138-a6f6-4005-b57d-ee02b9534e57-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.617781 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.618060 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e3e3138-a6f6-4005-b57d-ee02b9534e57-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.619708 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.628846 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2s95\" (UniqueName: \"kubernetes.io/projected/2e3e3138-a6f6-4005-b57d-ee02b9534e57-kube-api-access-v2s95\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.645324 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e3e3138-a6f6-4005-b57d-ee02b9534e57\") " pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713521 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713623 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713665 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713728 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713749 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713784 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-config-data\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713807 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713836 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713853 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713877 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47vkx\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-kube-api-access-47vkx\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.713904 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.714031 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.714818 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-config-data\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.714844 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc 
kubenswrapper[4991]: I1206 01:26:23.715006 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.715381 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.715926 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.718038 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.718471 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.718938 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.719311 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.731336 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47vkx\" (UniqueName: \"kubernetes.io/projected/f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a-kube-api-access-47vkx\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.763984 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a\") " pod="openstack/rabbitmq-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.797391 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:23 crc kubenswrapper[4991]: I1206 01:26:23.838688 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.068565 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-zb8t6"] Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.074560 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.077513 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.083330 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-zb8t6"] Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.229523 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-svc\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.230319 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.230387 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-config\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.230633 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " 
pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.230718 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.230744 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.230845 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjsk2\" (UniqueName: \"kubernetes.io/projected/e9971f93-88c0-412b-a660-94c7fab43a98-kube-api-access-gjsk2\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.332837 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.332878 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-config\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: 
\"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.332975 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.333011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.333026 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.333069 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjsk2\" (UniqueName: \"kubernetes.io/projected/e9971f93-88c0-412b-a660-94c7fab43a98-kube-api-access-gjsk2\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.333106 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-svc\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: 
\"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.333796 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.334072 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-config\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.334291 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-svc\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.334677 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.334683 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc 
kubenswrapper[4991]: I1206 01:26:24.334742 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.360929 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjsk2\" (UniqueName: \"kubernetes.io/projected/e9971f93-88c0-412b-a660-94c7fab43a98-kube-api-access-gjsk2\") pod \"dnsmasq-dns-5576978c7c-zb8t6\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.408119 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.412644 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:24 crc kubenswrapper[4991]: W1206 01:26:24.448349 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf06bb7d4_b2e9_4bd5_8ae4_8cd7458dcf8a.slice/crio-c705e5d918c9b86d5f0f825ec103da73883950121e85d6d92303009140db9ced WatchSource:0}: Error finding container c705e5d918c9b86d5f0f825ec103da73883950121e85d6d92303009140db9ced: Status 404 returned error can't find the container with id c705e5d918c9b86d5f0f825ec103da73883950121e85d6d92303009140db9ced Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.556240 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 06 01:26:24 crc kubenswrapper[4991]: I1206 01:26:24.986923 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-zb8t6"] Dec 06 01:26:25 crc kubenswrapper[4991]: W1206 01:26:25.009224 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9971f93_88c0_412b_a660_94c7fab43a98.slice/crio-88b969335512e5ac73de0441a82509be9591677febe87a906a96bf27e5d0a0fa WatchSource:0}: Error finding container 88b969335512e5ac73de0441a82509be9591677febe87a906a96bf27e5d0a0fa: Status 404 returned error can't find the container with id 88b969335512e5ac73de0441a82509be9591677febe87a906a96bf27e5d0a0fa Dec 06 01:26:25 crc kubenswrapper[4991]: I1206 01:26:25.069965 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8380dcf2-30be-4578-937b-5e2a8f4fc074" path="/var/lib/kubelet/pods/8380dcf2-30be-4578-937b-5e2a8f4fc074/volumes" Dec 06 01:26:25 crc kubenswrapper[4991]: I1206 01:26:25.072678 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af" path="/var/lib/kubelet/pods/d2ea53e8-fa71-4d0d-9c20-d7f1f24ff8af/volumes" Dec 06 01:26:25 crc 
kubenswrapper[4991]: I1206 01:26:25.347320 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" event={"ID":"e9971f93-88c0-412b-a660-94c7fab43a98","Type":"ContainerStarted","Data":"88b969335512e5ac73de0441a82509be9591677febe87a906a96bf27e5d0a0fa"} Dec 06 01:26:25 crc kubenswrapper[4991]: I1206 01:26:25.351441 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e3e3138-a6f6-4005-b57d-ee02b9534e57","Type":"ContainerStarted","Data":"19759df709945db9c98ecc1cfd96a75743e8be05989edcf5052b9b7cea5bb6e7"} Dec 06 01:26:25 crc kubenswrapper[4991]: I1206 01:26:25.353927 4991 generic.go:334] "Generic (PLEG): container finished" podID="eb527461-35b2-4636-9ca8-803adf47668f" containerID="17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd" exitCode=0 Dec 06 01:26:25 crc kubenswrapper[4991]: I1206 01:26:25.354024 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerDied","Data":"17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd"} Dec 06 01:26:25 crc kubenswrapper[4991]: I1206 01:26:25.355380 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a","Type":"ContainerStarted","Data":"c705e5d918c9b86d5f0f825ec103da73883950121e85d6d92303009140db9ced"} Dec 06 01:26:26 crc kubenswrapper[4991]: I1206 01:26:26.367212 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e3e3138-a6f6-4005-b57d-ee02b9534e57","Type":"ContainerStarted","Data":"f629ed7cfabb5309c79c2440650d949975c853625ab480ba4cbd8d6f9b3da6d9"} Dec 06 01:26:26 crc kubenswrapper[4991]: I1206 01:26:26.369898 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a","Type":"ContainerStarted","Data":"aad132a6e5ebd985cd9c87d0c9efee4f3b8a85b989d6b996bd58db227c43f4ce"} Dec 06 01:26:26 crc kubenswrapper[4991]: I1206 01:26:26.374482 4991 generic.go:334] "Generic (PLEG): container finished" podID="e9971f93-88c0-412b-a660-94c7fab43a98" containerID="4a90743ef33aeb47c6f93044fbd3923478c6129c07c0bf622a6df73a06b52017" exitCode=0 Dec 06 01:26:26 crc kubenswrapper[4991]: I1206 01:26:26.374526 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" event={"ID":"e9971f93-88c0-412b-a660-94c7fab43a98","Type":"ContainerDied","Data":"4a90743ef33aeb47c6f93044fbd3923478c6129c07c0bf622a6df73a06b52017"} Dec 06 01:26:27 crc kubenswrapper[4991]: I1206 01:26:27.387606 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerStarted","Data":"020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d"} Dec 06 01:26:27 crc kubenswrapper[4991]: I1206 01:26:27.389999 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" event={"ID":"e9971f93-88c0-412b-a660-94c7fab43a98","Type":"ContainerStarted","Data":"749a729c7be8d240d98d7f22cc3e56464e3a2a9de0852d17fa2a4741ed2a4381"} Dec 06 01:26:27 crc kubenswrapper[4991]: I1206 01:26:27.390287 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:27 crc kubenswrapper[4991]: I1206 01:26:27.414920 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwwhl" podStartSLOduration=3.469585163 podStartE2EDuration="8.414900102s" podCreationTimestamp="2025-12-06 01:26:19 +0000 UTC" firstStartedPulling="2025-12-06 01:26:21.299501146 +0000 UTC m=+1454.649791614" lastFinishedPulling="2025-12-06 01:26:26.244816085 +0000 UTC m=+1459.595106553" 
observedRunningTime="2025-12-06 01:26:27.409865574 +0000 UTC m=+1460.760156042" watchObservedRunningTime="2025-12-06 01:26:27.414900102 +0000 UTC m=+1460.765190570" Dec 06 01:26:27 crc kubenswrapper[4991]: I1206 01:26:27.437046 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" podStartSLOduration=3.437029028 podStartE2EDuration="3.437029028s" podCreationTimestamp="2025-12-06 01:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:26:27.429271359 +0000 UTC m=+1460.779561857" watchObservedRunningTime="2025-12-06 01:26:27.437029028 +0000 UTC m=+1460.787319496" Dec 06 01:26:30 crc kubenswrapper[4991]: I1206 01:26:30.196788 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:30 crc kubenswrapper[4991]: I1206 01:26:30.197438 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:31 crc kubenswrapper[4991]: I1206 01:26:31.243963 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vwwhl" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="registry-server" probeResult="failure" output=< Dec 06 01:26:31 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:26:31 crc kubenswrapper[4991]: > Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.414320 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.516681 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-7d6sp"] Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.516933 4991 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerName="dnsmasq-dns" containerID="cri-o://13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6" gracePeriod=10 Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.693807 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-69zsb"] Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.696533 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.716615 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-69zsb"] Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.779852 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcg5\" (UniqueName: \"kubernetes.io/projected/fa47d36c-149e-4778-b285-fa3b2e71277b-kube-api-access-vbcg5\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.780173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-config\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.780288 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc 
kubenswrapper[4991]: I1206 01:26:34.780496 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.780564 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.780594 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.780616 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.883831 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-config\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc 
kubenswrapper[4991]: I1206 01:26:34.883889 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.884128 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.884162 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.884179 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.884194 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.884960 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-config\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.885023 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.884561 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcg5\" (UniqueName: \"kubernetes.io/projected/fa47d36c-149e-4778-b285-fa3b2e71277b-kube-api-access-vbcg5\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.885461 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.887757 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.887942 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.887943 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa47d36c-149e-4778-b285-fa3b2e71277b-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:34 crc kubenswrapper[4991]: I1206 01:26:34.906405 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcg5\" (UniqueName: \"kubernetes.io/projected/fa47d36c-149e-4778-b285-fa3b2e71277b-kube-api-access-vbcg5\") pod \"dnsmasq-dns-8c6f6df99-69zsb\" (UID: \"fa47d36c-149e-4778-b285-fa3b2e71277b\") " pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.011543 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.025176 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.088638 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-nb\") pod \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.088733 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-swift-storage-0\") pod \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.088811 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-sb\") pod \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.088832 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-config\") pod \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.088899 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-svc\") pod \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.088933 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjs2b\" 
(UniqueName: \"kubernetes.io/projected/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-kube-api-access-mjs2b\") pod \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\" (UID: \"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49\") " Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.096716 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-kube-api-access-mjs2b" (OuterVolumeSpecName: "kube-api-access-mjs2b") pod "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" (UID: "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49"). InnerVolumeSpecName "kube-api-access-mjs2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.152183 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" (UID: "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.169721 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" (UID: "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.169822 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" (UID: "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.173423 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" (UID: "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.177041 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-config" (OuterVolumeSpecName: "config") pod "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" (UID: "73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.197202 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.197244 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjs2b\" (UniqueName: \"kubernetes.io/projected/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-kube-api-access-mjs2b\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.197275 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.197291 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" 
Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.197304 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.197316 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.484472 4991 generic.go:334] "Generic (PLEG): container finished" podID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerID="13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6" exitCode=0 Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.484527 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" event={"ID":"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49","Type":"ContainerDied","Data":"13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6"} Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.484558 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" event={"ID":"73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49","Type":"ContainerDied","Data":"4cd1a750ec6dabfb18b69aab52b5357900a97ee2eb56c3eb683afa20f364fd77"} Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.484579 4991 scope.go:117] "RemoveContainer" containerID="13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.484746 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-7d6sp" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.522869 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-69zsb"] Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.525927 4991 scope.go:117] "RemoveContainer" containerID="d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.534706 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-7d6sp"] Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.545245 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-7d6sp"] Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.559747 4991 scope.go:117] "RemoveContainer" containerID="13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6" Dec 06 01:26:35 crc kubenswrapper[4991]: E1206 01:26:35.560167 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6\": container with ID starting with 13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6 not found: ID does not exist" containerID="13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.560217 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6"} err="failed to get container status \"13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6\": rpc error: code = NotFound desc = could not find container \"13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6\": container with ID starting with 13e40c62eb15dcd741fac3f59f61b9a7f59d6e9b4f5146d35a55adb68b2e8de6 not found: ID does not exist" Dec 06 01:26:35 crc 
kubenswrapper[4991]: I1206 01:26:35.560238 4991 scope.go:117] "RemoveContainer" containerID="d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494" Dec 06 01:26:35 crc kubenswrapper[4991]: E1206 01:26:35.560806 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494\": container with ID starting with d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494 not found: ID does not exist" containerID="d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494" Dec 06 01:26:35 crc kubenswrapper[4991]: I1206 01:26:35.561381 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494"} err="failed to get container status \"d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494\": rpc error: code = NotFound desc = could not find container \"d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494\": container with ID starting with d38d022bccd7765326407e3d0f4e63f463bfd51e4267c33efe7ac06179cc8494 not found: ID does not exist" Dec 06 01:26:36 crc kubenswrapper[4991]: I1206 01:26:36.498749 4991 generic.go:334] "Generic (PLEG): container finished" podID="fa47d36c-149e-4778-b285-fa3b2e71277b" containerID="f3768186404542c60c20a0f6c468e3dbe0a925e1d2a454ff018f7dde1ee461f6" exitCode=0 Dec 06 01:26:36 crc kubenswrapper[4991]: I1206 01:26:36.498797 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" event={"ID":"fa47d36c-149e-4778-b285-fa3b2e71277b","Type":"ContainerDied","Data":"f3768186404542c60c20a0f6c468e3dbe0a925e1d2a454ff018f7dde1ee461f6"} Dec 06 01:26:36 crc kubenswrapper[4991]: I1206 01:26:36.499155 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" 
event={"ID":"fa47d36c-149e-4778-b285-fa3b2e71277b","Type":"ContainerStarted","Data":"9de3f9c6a01f7396cc0a1f2bfa37bfc8a88c09d45d633666df9941fc9b6619ca"} Dec 06 01:26:37 crc kubenswrapper[4991]: I1206 01:26:37.056837 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" path="/var/lib/kubelet/pods/73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49/volumes" Dec 06 01:26:37 crc kubenswrapper[4991]: I1206 01:26:37.516177 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" event={"ID":"fa47d36c-149e-4778-b285-fa3b2e71277b","Type":"ContainerStarted","Data":"a63c1ac3722088aeb7105e5cad782141992af3c608d970b9353fdbc1def5f21e"} Dec 06 01:26:37 crc kubenswrapper[4991]: I1206 01:26:37.516646 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:37 crc kubenswrapper[4991]: I1206 01:26:37.540211 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" podStartSLOduration=3.540182406 podStartE2EDuration="3.540182406s" podCreationTimestamp="2025-12-06 01:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:26:37.537825106 +0000 UTC m=+1470.888115584" watchObservedRunningTime="2025-12-06 01:26:37.540182406 +0000 UTC m=+1470.890472894" Dec 06 01:26:40 crc kubenswrapper[4991]: I1206 01:26:40.257895 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:40 crc kubenswrapper[4991]: I1206 01:26:40.313761 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:40 crc kubenswrapper[4991]: I1206 01:26:40.501540 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-vwwhl"] Dec 06 01:26:41 crc kubenswrapper[4991]: I1206 01:26:41.559243 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vwwhl" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="registry-server" containerID="cri-o://020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d" gracePeriod=2 Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.125399 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.249225 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-catalog-content\") pod \"eb527461-35b2-4636-9ca8-803adf47668f\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.249303 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s96j\" (UniqueName: \"kubernetes.io/projected/eb527461-35b2-4636-9ca8-803adf47668f-kube-api-access-7s96j\") pod \"eb527461-35b2-4636-9ca8-803adf47668f\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.249400 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-utilities\") pod \"eb527461-35b2-4636-9ca8-803adf47668f\" (UID: \"eb527461-35b2-4636-9ca8-803adf47668f\") " Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.250228 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-utilities" (OuterVolumeSpecName: "utilities") pod "eb527461-35b2-4636-9ca8-803adf47668f" (UID: 
"eb527461-35b2-4636-9ca8-803adf47668f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.255816 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb527461-35b2-4636-9ca8-803adf47668f-kube-api-access-7s96j" (OuterVolumeSpecName: "kube-api-access-7s96j") pod "eb527461-35b2-4636-9ca8-803adf47668f" (UID: "eb527461-35b2-4636-9ca8-803adf47668f"). InnerVolumeSpecName "kube-api-access-7s96j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.346631 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb527461-35b2-4636-9ca8-803adf47668f" (UID: "eb527461-35b2-4636-9ca8-803adf47668f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.353029 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.353077 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s96j\" (UniqueName: \"kubernetes.io/projected/eb527461-35b2-4636-9ca8-803adf47668f-kube-api-access-7s96j\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.353095 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb527461-35b2-4636-9ca8-803adf47668f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.574908 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="eb527461-35b2-4636-9ca8-803adf47668f" containerID="020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d" exitCode=0 Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.575021 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwwhl" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.575039 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerDied","Data":"020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d"} Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.575249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwwhl" event={"ID":"eb527461-35b2-4636-9ca8-803adf47668f","Type":"ContainerDied","Data":"a38be09ecd005f28c4e663126246434639683d49cc498dfb5ddca4cf4fc4ed2f"} Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.575288 4991 scope.go:117] "RemoveContainer" containerID="020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.612199 4991 scope.go:117] "RemoveContainer" containerID="17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.632748 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwwhl"] Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.647910 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vwwhl"] Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.666390 4991 scope.go:117] "RemoveContainer" containerID="c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.714643 4991 scope.go:117] "RemoveContainer" 
containerID="020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d" Dec 06 01:26:42 crc kubenswrapper[4991]: E1206 01:26:42.715353 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d\": container with ID starting with 020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d not found: ID does not exist" containerID="020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.715422 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d"} err="failed to get container status \"020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d\": rpc error: code = NotFound desc = could not find container \"020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d\": container with ID starting with 020f9b2f8e7ed202e4aa6e85bdf9b0813d8464b3ec4c90593d045cbf08621e1d not found: ID does not exist" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.715466 4991 scope.go:117] "RemoveContainer" containerID="17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd" Dec 06 01:26:42 crc kubenswrapper[4991]: E1206 01:26:42.715907 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd\": container with ID starting with 17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd not found: ID does not exist" containerID="17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.715941 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd"} err="failed to get container status \"17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd\": rpc error: code = NotFound desc = could not find container \"17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd\": container with ID starting with 17aa7aee82aa054025058af63b1268bee4ec2bb5777b92a4af90456943f7bccd not found: ID does not exist" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.715964 4991 scope.go:117] "RemoveContainer" containerID="c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067" Dec 06 01:26:42 crc kubenswrapper[4991]: E1206 01:26:42.716448 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067\": container with ID starting with c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067 not found: ID does not exist" containerID="c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067" Dec 06 01:26:42 crc kubenswrapper[4991]: I1206 01:26:42.716471 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067"} err="failed to get container status \"c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067\": rpc error: code = NotFound desc = could not find container \"c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067\": container with ID starting with c0a5569c5034e37a640465b342a9664ddae29c1e2b92f229d5a1f83f4f4e3067 not found: ID does not exist" Dec 06 01:26:43 crc kubenswrapper[4991]: I1206 01:26:43.054858 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb527461-35b2-4636-9ca8-803adf47668f" path="/var/lib/kubelet/pods/eb527461-35b2-4636-9ca8-803adf47668f/volumes" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 
01:26:45.027241 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-69zsb" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.127605 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-zb8t6"] Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.135356 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" containerName="dnsmasq-dns" containerID="cri-o://749a729c7be8d240d98d7f22cc3e56464e3a2a9de0852d17fa2a4741ed2a4381" gracePeriod=10 Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.610891 4991 generic.go:334] "Generic (PLEG): container finished" podID="e9971f93-88c0-412b-a660-94c7fab43a98" containerID="749a729c7be8d240d98d7f22cc3e56464e3a2a9de0852d17fa2a4741ed2a4381" exitCode=0 Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.611021 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" event={"ID":"e9971f93-88c0-412b-a660-94c7fab43a98","Type":"ContainerDied","Data":"749a729c7be8d240d98d7f22cc3e56464e3a2a9de0852d17fa2a4741ed2a4381"} Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.612044 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" event={"ID":"e9971f93-88c0-412b-a660-94c7fab43a98","Type":"ContainerDied","Data":"88b969335512e5ac73de0441a82509be9591677febe87a906a96bf27e5d0a0fa"} Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.612124 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b969335512e5ac73de0441a82509be9591677febe87a906a96bf27e5d0a0fa" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.632620 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.733872 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-svc\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.734027 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-config\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.734088 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-nb\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.734119 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjsk2\" (UniqueName: \"kubernetes.io/projected/e9971f93-88c0-412b-a660-94c7fab43a98-kube-api-access-gjsk2\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.734171 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-sb\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.734292 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-swift-storage-0\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.734361 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-openstack-edpm-ipam\") pod \"e9971f93-88c0-412b-a660-94c7fab43a98\" (UID: \"e9971f93-88c0-412b-a660-94c7fab43a98\") " Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.761023 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9971f93-88c0-412b-a660-94c7fab43a98-kube-api-access-gjsk2" (OuterVolumeSpecName: "kube-api-access-gjsk2") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "kube-api-access-gjsk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.803103 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.806468 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.807294 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-config" (OuterVolumeSpecName: "config") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.815606 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.819354 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.830411 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e9971f93-88c0-412b-a660-94c7fab43a98" (UID: "e9971f93-88c0-412b-a660-94c7fab43a98"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838076 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838110 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838120 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838133 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-config\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838142 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838150 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjsk2\" (UniqueName: \"kubernetes.io/projected/e9971f93-88c0-412b-a660-94c7fab43a98-kube-api-access-gjsk2\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.838161 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9971f93-88c0-412b-a660-94c7fab43a98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.912300 
4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-75hjn"] Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.912899 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerName="dnsmasq-dns" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.912918 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerName="dnsmasq-dns" Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.912933 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerName="init" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.912942 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerName="init" Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.912964 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" containerName="init" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.912971 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" containerName="init" Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.913006 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" containerName="dnsmasq-dns" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913016 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" containerName="dnsmasq-dns" Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.913029 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="extract-content" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913037 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb527461-35b2-4636-9ca8-803adf47668f" 
containerName="extract-content" Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.913048 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="registry-server" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913056 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="registry-server" Dec 06 01:26:45 crc kubenswrapper[4991]: E1206 01:26:45.913081 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="extract-utilities" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913087 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="extract-utilities" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913308 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" containerName="dnsmasq-dns" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913335 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f37bf5-c4a0-4a59-9d3e-d18a1de7dd49" containerName="dnsmasq-dns" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.913348 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb527461-35b2-4636-9ca8-803adf47668f" containerName="registry-server" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.915232 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:45 crc kubenswrapper[4991]: I1206 01:26:45.924144 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75hjn"] Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.042521 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-utilities\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.042576 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc6tv\" (UniqueName: \"kubernetes.io/projected/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-kube-api-access-jc6tv\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.042902 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-catalog-content\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.145085 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-utilities\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.145169 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jc6tv\" (UniqueName: \"kubernetes.io/projected/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-kube-api-access-jc6tv\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.146094 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-catalog-content\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.145588 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-utilities\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.146471 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-catalog-content\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.165801 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc6tv\" (UniqueName: \"kubernetes.io/projected/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-kube-api-access-jc6tv\") pod \"redhat-marketplace-75hjn\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.230918 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.623611 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-zb8t6" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.675089 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-zb8t6"] Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.700355 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-zb8t6"] Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.726501 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75hjn"] Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.745924 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.746017 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.746081 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.752669 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcb06d7c03c6d3eab8bf59045483a7760319fe20cd9fc9098eb56b0b1ff9d4ef"} 
pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:26:46 crc kubenswrapper[4991]: I1206 01:26:46.752794 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://fcb06d7c03c6d3eab8bf59045483a7760319fe20cd9fc9098eb56b0b1ff9d4ef" gracePeriod=600 Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.068544 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9971f93-88c0-412b-a660-94c7fab43a98" path="/var/lib/kubelet/pods/e9971f93-88c0-412b-a660-94c7fab43a98/volumes" Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.635842 4991 generic.go:334] "Generic (PLEG): container finished" podID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerID="2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58" exitCode=0 Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.635921 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75hjn" event={"ID":"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc","Type":"ContainerDied","Data":"2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58"} Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.636793 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75hjn" event={"ID":"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc","Type":"ContainerStarted","Data":"52ad3424d4995a9f8ba83695dd20bd44f14ebea8d8099eb569be1ea5d79bee04"} Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.642945 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="fcb06d7c03c6d3eab8bf59045483a7760319fe20cd9fc9098eb56b0b1ff9d4ef" exitCode=0 Dec 06 01:26:47 crc kubenswrapper[4991]: 
I1206 01:26:47.642976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"fcb06d7c03c6d3eab8bf59045483a7760319fe20cd9fc9098eb56b0b1ff9d4ef"} Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.643015 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb"} Dec 06 01:26:47 crc kubenswrapper[4991]: I1206 01:26:47.643035 4991 scope.go:117] "RemoveContainer" containerID="dba3ac16754d5a6b2843c8e44c7c8c77526e3ed39ca408957653647034fac3f9" Dec 06 01:26:49 crc kubenswrapper[4991]: I1206 01:26:49.677213 4991 generic.go:334] "Generic (PLEG): container finished" podID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerID="d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533" exitCode=0 Dec 06 01:26:49 crc kubenswrapper[4991]: I1206 01:26:49.677479 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75hjn" event={"ID":"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc","Type":"ContainerDied","Data":"d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533"} Dec 06 01:26:50 crc kubenswrapper[4991]: I1206 01:26:50.700681 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75hjn" event={"ID":"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc","Type":"ContainerStarted","Data":"8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65"} Dec 06 01:26:50 crc kubenswrapper[4991]: I1206 01:26:50.736096 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-75hjn" podStartSLOduration=3.201564859 podStartE2EDuration="5.736063108s" podCreationTimestamp="2025-12-06 
01:26:45 +0000 UTC" firstStartedPulling="2025-12-06 01:26:47.638646446 +0000 UTC m=+1480.988936924" lastFinishedPulling="2025-12-06 01:26:50.173144705 +0000 UTC m=+1483.523435173" observedRunningTime="2025-12-06 01:26:50.725632932 +0000 UTC m=+1484.075923480" watchObservedRunningTime="2025-12-06 01:26:50.736063108 +0000 UTC m=+1484.086353656" Dec 06 01:26:56 crc kubenswrapper[4991]: I1206 01:26:56.231661 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:56 crc kubenswrapper[4991]: I1206 01:26:56.232593 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:56 crc kubenswrapper[4991]: I1206 01:26:56.310609 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:56 crc kubenswrapper[4991]: I1206 01:26:56.847454 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:56 crc kubenswrapper[4991]: I1206 01:26:56.914354 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75hjn"] Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.123300 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l"] Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.132405 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.135572 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.136664 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.137043 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.137372 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.173122 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l"] Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.213059 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.213226 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.213328 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp464\" (UniqueName: \"kubernetes.io/projected/128953a1-f318-43c1-a2bb-3b006c8600a4-kube-api-access-qp464\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.213714 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.315468 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.315539 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.315573 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp464\" (UniqueName: \"kubernetes.io/projected/128953a1-f318-43c1-a2bb-3b006c8600a4-kube-api-access-qp464\") 
pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.315660 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.322865 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.328281 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.334599 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.344423 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qp464\" (UniqueName: \"kubernetes.io/projected/128953a1-f318-43c1-a2bb-3b006c8600a4-kube-api-access-qp464\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.493290 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.804495 4991 generic.go:334] "Generic (PLEG): container finished" podID="f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a" containerID="aad132a6e5ebd985cd9c87d0c9efee4f3b8a85b989d6b996bd58db227c43f4ce" exitCode=0 Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.804584 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a","Type":"ContainerDied","Data":"aad132a6e5ebd985cd9c87d0c9efee4f3b8a85b989d6b996bd58db227c43f4ce"} Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.807218 4991 generic.go:334] "Generic (PLEG): container finished" podID="2e3e3138-a6f6-4005-b57d-ee02b9534e57" containerID="f629ed7cfabb5309c79c2440650d949975c853625ab480ba4cbd8d6f9b3da6d9" exitCode=0 Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.807263 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e3e3138-a6f6-4005-b57d-ee02b9534e57","Type":"ContainerDied","Data":"f629ed7cfabb5309c79c2440650d949975c853625ab480ba4cbd8d6f9b3da6d9"} Dec 06 01:26:58 crc kubenswrapper[4991]: I1206 01:26:58.807532 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-75hjn" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="registry-server" 
containerID="cri-o://8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65" gracePeriod=2 Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.067085 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l"] Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.213377 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.237052 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-catalog-content\") pod \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.237286 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc6tv\" (UniqueName: \"kubernetes.io/projected/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-kube-api-access-jc6tv\") pod \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.237387 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-utilities\") pod \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\" (UID: \"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc\") " Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.238087 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-utilities" (OuterVolumeSpecName: "utilities") pod "4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" (UID: "4ec9eda6-ce50-431e-b2a0-b41d606fe1fc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.247181 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-kube-api-access-jc6tv" (OuterVolumeSpecName: "kube-api-access-jc6tv") pod "4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" (UID: "4ec9eda6-ce50-431e-b2a0-b41d606fe1fc"). InnerVolumeSpecName "kube-api-access-jc6tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.273814 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" (UID: "4ec9eda6-ce50-431e-b2a0-b41d606fe1fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.340309 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc6tv\" (UniqueName: \"kubernetes.io/projected/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-kube-api-access-jc6tv\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.340350 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.340362 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.825998 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a","Type":"ContainerStarted","Data":"84b08b0c2d29e3347d9a5766f319ac44897b4480125bab0c777396cc812d6fbc"} Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.826757 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.834713 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" event={"ID":"128953a1-f318-43c1-a2bb-3b006c8600a4","Type":"ContainerStarted","Data":"4dbd8703f282580c856970b8cd0d57ee155a41b8b318f9207e769846f354c5a6"} Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.838856 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e3e3138-a6f6-4005-b57d-ee02b9534e57","Type":"ContainerStarted","Data":"5054925809ed2b9861f07e131b00287d2414696954fc0a7deda038826e77dc79"} Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.839569 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.843785 4991 generic.go:334] "Generic (PLEG): container finished" podID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerID="8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65" exitCode=0 Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.843837 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75hjn" event={"ID":"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc","Type":"ContainerDied","Data":"8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65"} Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.843870 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75hjn" 
event={"ID":"4ec9eda6-ce50-431e-b2a0-b41d606fe1fc","Type":"ContainerDied","Data":"52ad3424d4995a9f8ba83695dd20bd44f14ebea8d8099eb569be1ea5d79bee04"} Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.843890 4991 scope.go:117] "RemoveContainer" containerID="8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.843925 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75hjn" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.870432 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.870395591 podStartE2EDuration="36.870395591s" podCreationTimestamp="2025-12-06 01:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:26:59.858865116 +0000 UTC m=+1493.209155584" watchObservedRunningTime="2025-12-06 01:26:59.870395591 +0000 UTC m=+1493.220686049" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.890294 4991 scope.go:117] "RemoveContainer" containerID="d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.908068 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.899619018 podStartE2EDuration="36.899619018s" podCreationTimestamp="2025-12-06 01:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 01:26:59.888585186 +0000 UTC m=+1493.238875664" watchObservedRunningTime="2025-12-06 01:26:59.899619018 +0000 UTC m=+1493.249909486" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.932968 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-75hjn"] Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.942150 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-75hjn"] Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.974469 4991 scope.go:117] "RemoveContainer" containerID="2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.997540 4991 scope.go:117] "RemoveContainer" containerID="8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65" Dec 06 01:26:59 crc kubenswrapper[4991]: E1206 01:26:59.998106 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65\": container with ID starting with 8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65 not found: ID does not exist" containerID="8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.998158 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65"} err="failed to get container status \"8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65\": rpc error: code = NotFound desc = could not find container \"8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65\": container with ID starting with 8ecc9ad9a01447bfa16738d3a8caa7c7c6aa5b77fd1db0e6dcde8eef8316bd65 not found: ID does not exist" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.998196 4991 scope.go:117] "RemoveContainer" containerID="d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533" Dec 06 01:26:59 crc kubenswrapper[4991]: E1206 01:26:59.998639 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533\": container with ID starting with d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533 not found: ID does not exist" containerID="d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.998667 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533"} err="failed to get container status \"d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533\": rpc error: code = NotFound desc = could not find container \"d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533\": container with ID starting with d0f94a63a82de98debd66714717bd1cdfae7aae4db28930eb277fb1a347ed533 not found: ID does not exist" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.998684 4991 scope.go:117] "RemoveContainer" containerID="2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58" Dec 06 01:26:59 crc kubenswrapper[4991]: E1206 01:26:59.999073 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58\": container with ID starting with 2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58 not found: ID does not exist" containerID="2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58" Dec 06 01:26:59 crc kubenswrapper[4991]: I1206 01:26:59.999124 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58"} err="failed to get container status \"2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58\": rpc error: code = NotFound desc = could not find container \"2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58\": 
container with ID starting with 2325e181f90046b899b4da68166bfdf74270727f4c80cc540b743a9ab9048a58 not found: ID does not exist" Dec 06 01:27:00 crc kubenswrapper[4991]: E1206 01:27:00.071160 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec9eda6_ce50_431e_b2a0_b41d606fe1fc.slice/crio-52ad3424d4995a9f8ba83695dd20bd44f14ebea8d8099eb569be1ea5d79bee04\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec9eda6_ce50_431e_b2a0_b41d606fe1fc.slice\": RecentStats: unable to find data in memory cache]" Dec 06 01:27:01 crc kubenswrapper[4991]: I1206 01:27:01.049763 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" path="/var/lib/kubelet/pods/4ec9eda6-ce50-431e-b2a0-b41d606fe1fc/volumes" Dec 06 01:27:13 crc kubenswrapper[4991]: I1206 01:27:13.802175 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 06 01:27:13 crc kubenswrapper[4991]: I1206 01:27:13.843184 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 06 01:27:14 crc kubenswrapper[4991]: I1206 01:27:14.427158 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:27:14 crc kubenswrapper[4991]: I1206 01:27:14.779924 4991 scope.go:117] "RemoveContainer" containerID="ba88aa1a0ccfd2b59d2138748ee66af24deb48ba69955de9f56aa083e4ca9001" Dec 06 01:27:15 crc kubenswrapper[4991]: I1206 01:27:15.035278 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" event={"ID":"128953a1-f318-43c1-a2bb-3b006c8600a4","Type":"ContainerStarted","Data":"53a167df08d721172bc03e0db4e19dfa765362552a82e8184718491b59d455fa"} Dec 06 
01:27:15 crc kubenswrapper[4991]: I1206 01:27:15.062234 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" podStartSLOduration=1.7227937089999998 podStartE2EDuration="17.062211553s" podCreationTimestamp="2025-12-06 01:26:58 +0000 UTC" firstStartedPulling="2025-12-06 01:26:59.085412473 +0000 UTC m=+1492.435702941" lastFinishedPulling="2025-12-06 01:27:14.424830317 +0000 UTC m=+1507.775120785" observedRunningTime="2025-12-06 01:27:15.061624988 +0000 UTC m=+1508.411915476" watchObservedRunningTime="2025-12-06 01:27:15.062211553 +0000 UTC m=+1508.412502041" Dec 06 01:27:26 crc kubenswrapper[4991]: I1206 01:27:26.159163 4991 generic.go:334] "Generic (PLEG): container finished" podID="128953a1-f318-43c1-a2bb-3b006c8600a4" containerID="53a167df08d721172bc03e0db4e19dfa765362552a82e8184718491b59d455fa" exitCode=0 Dec 06 01:27:26 crc kubenswrapper[4991]: I1206 01:27:26.159773 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" event={"ID":"128953a1-f318-43c1-a2bb-3b006c8600a4","Type":"ContainerDied","Data":"53a167df08d721172bc03e0db4e19dfa765362552a82e8184718491b59d455fa"} Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.752978 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.837100 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-repo-setup-combined-ca-bundle\") pod \"128953a1-f318-43c1-a2bb-3b006c8600a4\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.837153 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-ssh-key\") pod \"128953a1-f318-43c1-a2bb-3b006c8600a4\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.837209 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-inventory\") pod \"128953a1-f318-43c1-a2bb-3b006c8600a4\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.837266 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp464\" (UniqueName: \"kubernetes.io/projected/128953a1-f318-43c1-a2bb-3b006c8600a4-kube-api-access-qp464\") pod \"128953a1-f318-43c1-a2bb-3b006c8600a4\" (UID: \"128953a1-f318-43c1-a2bb-3b006c8600a4\") " Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.866280 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "128953a1-f318-43c1-a2bb-3b006c8600a4" (UID: "128953a1-f318-43c1-a2bb-3b006c8600a4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.866465 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128953a1-f318-43c1-a2bb-3b006c8600a4-kube-api-access-qp464" (OuterVolumeSpecName: "kube-api-access-qp464") pod "128953a1-f318-43c1-a2bb-3b006c8600a4" (UID: "128953a1-f318-43c1-a2bb-3b006c8600a4"). InnerVolumeSpecName "kube-api-access-qp464". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.933485 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "128953a1-f318-43c1-a2bb-3b006c8600a4" (UID: "128953a1-f318-43c1-a2bb-3b006c8600a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.938761 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-inventory" (OuterVolumeSpecName: "inventory") pod "128953a1-f318-43c1-a2bb-3b006c8600a4" (UID: "128953a1-f318-43c1-a2bb-3b006c8600a4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.940392 4991 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.940409 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.940419 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/128953a1-f318-43c1-a2bb-3b006c8600a4-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:27 crc kubenswrapper[4991]: I1206 01:27:27.940429 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp464\" (UniqueName: \"kubernetes.io/projected/128953a1-f318-43c1-a2bb-3b006c8600a4-kube-api-access-qp464\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.180415 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" event={"ID":"128953a1-f318-43c1-a2bb-3b006c8600a4","Type":"ContainerDied","Data":"4dbd8703f282580c856970b8cd0d57ee155a41b8b318f9207e769846f354c5a6"} Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.180452 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbd8703f282580c856970b8cd0d57ee155a41b8b318f9207e769846f354c5a6" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.180507 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.305757 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr"] Dec 06 01:27:28 crc kubenswrapper[4991]: E1206 01:27:28.306214 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="extract-utilities" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.306231 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="extract-utilities" Dec 06 01:27:28 crc kubenswrapper[4991]: E1206 01:27:28.306250 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128953a1-f318-43c1-a2bb-3b006c8600a4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.306259 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="128953a1-f318-43c1-a2bb-3b006c8600a4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 01:27:28 crc kubenswrapper[4991]: E1206 01:27:28.306275 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="extract-content" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.306281 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="extract-content" Dec 06 01:27:28 crc kubenswrapper[4991]: E1206 01:27:28.306311 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="registry-server" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.306316 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="registry-server" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.306513 4991 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec9eda6-ce50-431e-b2a0-b41d606fe1fc" containerName="registry-server" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.306543 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="128953a1-f318-43c1-a2bb-3b006c8600a4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.307247 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.309727 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.315250 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.315906 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.315904 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.318714 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr"] Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.449190 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.449257 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps58\" (UniqueName: \"kubernetes.io/projected/eb12fac8-d6c4-4be8-907f-a06cae6bc067-kube-api-access-zps58\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.449313 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.551493 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.551551 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps58\" (UniqueName: \"kubernetes.io/projected/eb12fac8-d6c4-4be8-907f-a06cae6bc067-kube-api-access-zps58\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.551611 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: 
\"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.556825 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.557398 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.572142 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps58\" (UniqueName: \"kubernetes.io/projected/eb12fac8-d6c4-4be8-907f-a06cae6bc067-kube-api-access-zps58\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zvphr\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:28 crc kubenswrapper[4991]: I1206 01:27:28.629205 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:29 crc kubenswrapper[4991]: I1206 01:27:29.230122 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr"] Dec 06 01:27:30 crc kubenswrapper[4991]: I1206 01:27:30.202090 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" event={"ID":"eb12fac8-d6c4-4be8-907f-a06cae6bc067","Type":"ContainerStarted","Data":"18fe720bb7462ee70c312676c7bbb3fe02a7926ae43ef5555d4b6eabc6e23568"} Dec 06 01:27:30 crc kubenswrapper[4991]: I1206 01:27:30.202585 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" event={"ID":"eb12fac8-d6c4-4be8-907f-a06cae6bc067","Type":"ContainerStarted","Data":"a8e8ff985a82105e7307fec65eb8d9c400525c6d370dd295a453a3e560fbbe9b"} Dec 06 01:27:30 crc kubenswrapper[4991]: I1206 01:27:30.249233 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" podStartSLOduration=2.07625494 podStartE2EDuration="2.249207909s" podCreationTimestamp="2025-12-06 01:27:28 +0000 UTC" firstStartedPulling="2025-12-06 01:27:29.232265465 +0000 UTC m=+1522.582555973" lastFinishedPulling="2025-12-06 01:27:29.405218474 +0000 UTC m=+1522.755508942" observedRunningTime="2025-12-06 01:27:30.222580269 +0000 UTC m=+1523.572870747" watchObservedRunningTime="2025-12-06 01:27:30.249207909 +0000 UTC m=+1523.599498387" Dec 06 01:27:32 crc kubenswrapper[4991]: I1206 01:27:32.220471 4991 generic.go:334] "Generic (PLEG): container finished" podID="eb12fac8-d6c4-4be8-907f-a06cae6bc067" containerID="18fe720bb7462ee70c312676c7bbb3fe02a7926ae43ef5555d4b6eabc6e23568" exitCode=0 Dec 06 01:27:32 crc kubenswrapper[4991]: I1206 01:27:32.220576 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" event={"ID":"eb12fac8-d6c4-4be8-907f-a06cae6bc067","Type":"ContainerDied","Data":"18fe720bb7462ee70c312676c7bbb3fe02a7926ae43ef5555d4b6eabc6e23568"} Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.788371 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.852910 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-ssh-key\") pod \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.853010 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-inventory\") pod \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.853040 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zps58\" (UniqueName: \"kubernetes.io/projected/eb12fac8-d6c4-4be8-907f-a06cae6bc067-kube-api-access-zps58\") pod \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\" (UID: \"eb12fac8-d6c4-4be8-907f-a06cae6bc067\") " Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.861349 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb12fac8-d6c4-4be8-907f-a06cae6bc067-kube-api-access-zps58" (OuterVolumeSpecName: "kube-api-access-zps58") pod "eb12fac8-d6c4-4be8-907f-a06cae6bc067" (UID: "eb12fac8-d6c4-4be8-907f-a06cae6bc067"). InnerVolumeSpecName "kube-api-access-zps58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.902528 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eb12fac8-d6c4-4be8-907f-a06cae6bc067" (UID: "eb12fac8-d6c4-4be8-907f-a06cae6bc067"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.914682 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-inventory" (OuterVolumeSpecName: "inventory") pod "eb12fac8-d6c4-4be8-907f-a06cae6bc067" (UID: "eb12fac8-d6c4-4be8-907f-a06cae6bc067"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.956770 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.956835 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb12fac8-d6c4-4be8-907f-a06cae6bc067-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:33 crc kubenswrapper[4991]: I1206 01:27:33.956856 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zps58\" (UniqueName: \"kubernetes.io/projected/eb12fac8-d6c4-4be8-907f-a06cae6bc067-kube-api-access-zps58\") on node \"crc\" DevicePath \"\"" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.247845 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" 
event={"ID":"eb12fac8-d6c4-4be8-907f-a06cae6bc067","Type":"ContainerDied","Data":"a8e8ff985a82105e7307fec65eb8d9c400525c6d370dd295a453a3e560fbbe9b"} Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.247934 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8e8ff985a82105e7307fec65eb8d9c400525c6d370dd295a453a3e560fbbe9b" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.248033 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zvphr" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.347526 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn"] Dec 06 01:27:34 crc kubenswrapper[4991]: E1206 01:27:34.348125 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb12fac8-d6c4-4be8-907f-a06cae6bc067" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.348151 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb12fac8-d6c4-4be8-907f-a06cae6bc067" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.348434 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb12fac8-d6c4-4be8-907f-a06cae6bc067" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.349299 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.357047 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.357879 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.358116 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.358288 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.372789 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn"] Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.467628 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjqn\" (UniqueName: \"kubernetes.io/projected/aa853f65-564a-473e-a155-b5ac6e57eedf-kube-api-access-ltjqn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.467780 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 
01:27:34.467808 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.467861 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.569659 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.569716 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.569774 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.569801 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjqn\" (UniqueName: \"kubernetes.io/projected/aa853f65-564a-473e-a155-b5ac6e57eedf-kube-api-access-ltjqn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.575120 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.575277 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.576022 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.586419 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ltjqn\" (UniqueName: \"kubernetes.io/projected/aa853f65-564a-473e-a155-b5ac6e57eedf-kube-api-access-ltjqn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:34 crc kubenswrapper[4991]: I1206 01:27:34.672057 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:27:35 crc kubenswrapper[4991]: W1206 01:27:35.316367 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa853f65_564a_473e_a155_b5ac6e57eedf.slice/crio-d818de5dd84af3a24184df52586f698ae91fa4e11b7a61ddfba144dae1f2f216 WatchSource:0}: Error finding container d818de5dd84af3a24184df52586f698ae91fa4e11b7a61ddfba144dae1f2f216: Status 404 returned error can't find the container with id d818de5dd84af3a24184df52586f698ae91fa4e11b7a61ddfba144dae1f2f216 Dec 06 01:27:35 crc kubenswrapper[4991]: I1206 01:27:35.320117 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn"] Dec 06 01:27:36 crc kubenswrapper[4991]: I1206 01:27:36.283825 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" event={"ID":"aa853f65-564a-473e-a155-b5ac6e57eedf","Type":"ContainerStarted","Data":"57b954e488018583085bd10a1e1ae096a12dc672b5f7b1ceb2c21f1ee8ce8a02"} Dec 06 01:27:36 crc kubenswrapper[4991]: I1206 01:27:36.284239 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" event={"ID":"aa853f65-564a-473e-a155-b5ac6e57eedf","Type":"ContainerStarted","Data":"d818de5dd84af3a24184df52586f698ae91fa4e11b7a61ddfba144dae1f2f216"} Dec 06 01:27:36 crc kubenswrapper[4991]: 
I1206 01:27:36.321192 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" podStartSLOduration=2.0489933909999998 podStartE2EDuration="2.321152895s" podCreationTimestamp="2025-12-06 01:27:34 +0000 UTC" firstStartedPulling="2025-12-06 01:27:35.319144802 +0000 UTC m=+1528.669435280" lastFinishedPulling="2025-12-06 01:27:35.591304296 +0000 UTC m=+1528.941594784" observedRunningTime="2025-12-06 01:27:36.305491555 +0000 UTC m=+1529.655782043" watchObservedRunningTime="2025-12-06 01:27:36.321152895 +0000 UTC m=+1529.671443433" Dec 06 01:28:14 crc kubenswrapper[4991]: I1206 01:28:14.932961 4991 scope.go:117] "RemoveContainer" containerID="96b7d648cf0392096ea187447910834b858e1d9561006dcdf7a4c7bfed01f7d9" Dec 06 01:28:14 crc kubenswrapper[4991]: I1206 01:28:14.982206 4991 scope.go:117] "RemoveContainer" containerID="86cf516df72440324964d815e6964ee90ca245175fb9dd35a5350df7d283107f" Dec 06 01:28:15 crc kubenswrapper[4991]: I1206 01:28:15.045182 4991 scope.go:117] "RemoveContainer" containerID="b9efb04596d3956458c1741004cf0c3c747574ec679f10d4a938952ad80a97e6" Dec 06 01:29:15 crc kubenswrapper[4991]: I1206 01:29:15.170750 4991 scope.go:117] "RemoveContainer" containerID="56968dca98b9f302ced297b2fdc62f37175ba969001325560b448e6f9b944ad6" Dec 06 01:29:15 crc kubenswrapper[4991]: I1206 01:29:15.229676 4991 scope.go:117] "RemoveContainer" containerID="9c8d337db25b8faca94ece7dc0776b6fb45830b2b42f2b7d52e4a5a4778a214e" Dec 06 01:29:15 crc kubenswrapper[4991]: I1206 01:29:15.283708 4991 scope.go:117] "RemoveContainer" containerID="696c3f31706c7bcf6be1fafe3f512c3ea184441fcb34083126b168a2bdddc61d" Dec 06 01:29:16 crc kubenswrapper[4991]: I1206 01:29:16.739782 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 06 01:29:16 crc kubenswrapper[4991]: I1206 01:29:16.739867 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:29:46 crc kubenswrapper[4991]: I1206 01:29:46.739913 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:29:46 crc kubenswrapper[4991]: I1206 01:29:46.740594 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.157362 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68"] Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.159596 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.164595 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.164908 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.186201 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-config-volume\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.186644 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwh2t\" (UniqueName: \"kubernetes.io/projected/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-kube-api-access-lwh2t\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.187024 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-secret-volume\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.205640 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68"] Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.288745 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-secret-volume\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.288804 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-config-volume\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.288910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwh2t\" (UniqueName: \"kubernetes.io/projected/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-kube-api-access-lwh2t\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.289803 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-config-volume\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.301666 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-secret-volume\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.311293 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwh2t\" (UniqueName: \"kubernetes.io/projected/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-kube-api-access-lwh2t\") pod \"collect-profiles-29416410-prk68\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:00 crc kubenswrapper[4991]: I1206 01:30:00.527781 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:01 crc kubenswrapper[4991]: I1206 01:30:01.071258 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68"] Dec 06 01:30:02 crc kubenswrapper[4991]: I1206 01:30:02.077917 4991 generic.go:334] "Generic (PLEG): container finished" podID="25dc6d8b-19d1-45c4-97ed-e5c426a46dce" containerID="54631a6f269a9d0599b393fce077d2ab0395ce90b3697f6808de48227ca63118" exitCode=0 Dec 06 01:30:02 crc kubenswrapper[4991]: I1206 01:30:02.078004 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" event={"ID":"25dc6d8b-19d1-45c4-97ed-e5c426a46dce","Type":"ContainerDied","Data":"54631a6f269a9d0599b393fce077d2ab0395ce90b3697f6808de48227ca63118"} Dec 06 01:30:02 crc kubenswrapper[4991]: I1206 01:30:02.078046 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" 
event={"ID":"25dc6d8b-19d1-45c4-97ed-e5c426a46dce","Type":"ContainerStarted","Data":"fc66e978e7bfa75c70403a1257a10df74fb049423a14168c8a886cb2865854e6"} Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.435433 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.456469 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-secret-volume\") pod \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.456712 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwh2t\" (UniqueName: \"kubernetes.io/projected/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-kube-api-access-lwh2t\") pod \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.456906 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-config-volume\") pod \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\" (UID: \"25dc6d8b-19d1-45c4-97ed-e5c426a46dce\") " Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.458496 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-config-volume" (OuterVolumeSpecName: "config-volume") pod "25dc6d8b-19d1-45c4-97ed-e5c426a46dce" (UID: "25dc6d8b-19d1-45c4-97ed-e5c426a46dce"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.468289 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25dc6d8b-19d1-45c4-97ed-e5c426a46dce" (UID: "25dc6d8b-19d1-45c4-97ed-e5c426a46dce"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.468451 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-kube-api-access-lwh2t" (OuterVolumeSpecName: "kube-api-access-lwh2t") pod "25dc6d8b-19d1-45c4-97ed-e5c426a46dce" (UID: "25dc6d8b-19d1-45c4-97ed-e5c426a46dce"). InnerVolumeSpecName "kube-api-access-lwh2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.558862 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.558900 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:03 crc kubenswrapper[4991]: I1206 01:30:03.558917 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwh2t\" (UniqueName: \"kubernetes.io/projected/25dc6d8b-19d1-45c4-97ed-e5c426a46dce-kube-api-access-lwh2t\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:04 crc kubenswrapper[4991]: I1206 01:30:04.107393 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" 
event={"ID":"25dc6d8b-19d1-45c4-97ed-e5c426a46dce","Type":"ContainerDied","Data":"fc66e978e7bfa75c70403a1257a10df74fb049423a14168c8a886cb2865854e6"} Dec 06 01:30:04 crc kubenswrapper[4991]: I1206 01:30:04.107467 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc66e978e7bfa75c70403a1257a10df74fb049423a14168c8a886cb2865854e6" Dec 06 01:30:04 crc kubenswrapper[4991]: I1206 01:30:04.108055 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68" Dec 06 01:30:16 crc kubenswrapper[4991]: I1206 01:30:16.740098 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:30:16 crc kubenswrapper[4991]: I1206 01:30:16.740585 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:30:16 crc kubenswrapper[4991]: I1206 01:30:16.740638 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:30:16 crc kubenswrapper[4991]: I1206 01:30:16.741481 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:30:16 crc 
kubenswrapper[4991]: I1206 01:30:16.741551 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" gracePeriod=600 Dec 06 01:30:16 crc kubenswrapper[4991]: E1206 01:30:16.919240 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:30:17 crc kubenswrapper[4991]: I1206 01:30:17.261609 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" exitCode=0 Dec 06 01:30:17 crc kubenswrapper[4991]: I1206 01:30:17.261664 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb"} Dec 06 01:30:17 crc kubenswrapper[4991]: I1206 01:30:17.264386 4991 scope.go:117] "RemoveContainer" containerID="fcb06d7c03c6d3eab8bf59045483a7760319fe20cd9fc9098eb56b0b1ff9d4ef" Dec 06 01:30:17 crc kubenswrapper[4991]: I1206 01:30:17.264604 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:30:17 crc kubenswrapper[4991]: E1206 01:30:17.265358 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:30:28 crc kubenswrapper[4991]: I1206 01:30:28.037497 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:30:28 crc kubenswrapper[4991]: E1206 01:30:28.038357 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:30:42 crc kubenswrapper[4991]: I1206 01:30:42.041070 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:30:42 crc kubenswrapper[4991]: E1206 01:30:42.042938 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:30:55 crc kubenswrapper[4991]: I1206 01:30:55.039179 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:30:55 crc kubenswrapper[4991]: E1206 01:30:55.040423 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:30:57 crc kubenswrapper[4991]: I1206 01:30:57.709027 4991 generic.go:334] "Generic (PLEG): container finished" podID="aa853f65-564a-473e-a155-b5ac6e57eedf" containerID="57b954e488018583085bd10a1e1ae096a12dc672b5f7b1ceb2c21f1ee8ce8a02" exitCode=0 Dec 06 01:30:57 crc kubenswrapper[4991]: I1206 01:30:57.709117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" event={"ID":"aa853f65-564a-473e-a155-b5ac6e57eedf","Type":"ContainerDied","Data":"57b954e488018583085bd10a1e1ae096a12dc672b5f7b1ceb2c21f1ee8ce8a02"} Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.341626 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.498948 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-inventory\") pod \"aa853f65-564a-473e-a155-b5ac6e57eedf\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.499649 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-ssh-key\") pod \"aa853f65-564a-473e-a155-b5ac6e57eedf\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.499759 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-bootstrap-combined-ca-bundle\") pod \"aa853f65-564a-473e-a155-b5ac6e57eedf\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.499875 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltjqn\" (UniqueName: \"kubernetes.io/projected/aa853f65-564a-473e-a155-b5ac6e57eedf-kube-api-access-ltjqn\") pod \"aa853f65-564a-473e-a155-b5ac6e57eedf\" (UID: \"aa853f65-564a-473e-a155-b5ac6e57eedf\") " Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.504891 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa853f65-564a-473e-a155-b5ac6e57eedf-kube-api-access-ltjqn" (OuterVolumeSpecName: "kube-api-access-ltjqn") pod "aa853f65-564a-473e-a155-b5ac6e57eedf" (UID: "aa853f65-564a-473e-a155-b5ac6e57eedf"). InnerVolumeSpecName "kube-api-access-ltjqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.506150 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "aa853f65-564a-473e-a155-b5ac6e57eedf" (UID: "aa853f65-564a-473e-a155-b5ac6e57eedf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.529690 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa853f65-564a-473e-a155-b5ac6e57eedf" (UID: "aa853f65-564a-473e-a155-b5ac6e57eedf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.548835 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-inventory" (OuterVolumeSpecName: "inventory") pod "aa853f65-564a-473e-a155-b5ac6e57eedf" (UID: "aa853f65-564a-473e-a155-b5ac6e57eedf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.602700 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltjqn\" (UniqueName: \"kubernetes.io/projected/aa853f65-564a-473e-a155-b5ac6e57eedf-kube-api-access-ltjqn\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.602737 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.602747 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.602757 4991 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa853f65-564a-473e-a155-b5ac6e57eedf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.758025 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" event={"ID":"aa853f65-564a-473e-a155-b5ac6e57eedf","Type":"ContainerDied","Data":"d818de5dd84af3a24184df52586f698ae91fa4e11b7a61ddfba144dae1f2f216"} Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.758076 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d818de5dd84af3a24184df52586f698ae91fa4e11b7a61ddfba144dae1f2f216" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.758140 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.831274 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w"] Dec 06 01:30:59 crc kubenswrapper[4991]: E1206 01:30:59.831673 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dc6d8b-19d1-45c4-97ed-e5c426a46dce" containerName="collect-profiles" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.831694 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dc6d8b-19d1-45c4-97ed-e5c426a46dce" containerName="collect-profiles" Dec 06 01:30:59 crc kubenswrapper[4991]: E1206 01:30:59.831709 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa853f65-564a-473e-a155-b5ac6e57eedf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.831718 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa853f65-564a-473e-a155-b5ac6e57eedf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.831889 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa853f65-564a-473e-a155-b5ac6e57eedf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.831909 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="25dc6d8b-19d1-45c4-97ed-e5c426a46dce" containerName="collect-profiles" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.832528 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.836786 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.836853 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.838319 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.843307 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:30:59 crc kubenswrapper[4991]: I1206 01:30:59.864917 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w"] Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.009643 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.009799 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-574gt\" (UniqueName: \"kubernetes.io/projected/bc316311-f861-4cc3-bb9f-1a340e3ec877-kube-api-access-574gt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 
01:31:00.010684 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.112887 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-574gt\" (UniqueName: \"kubernetes.io/projected/bc316311-f861-4cc3-bb9f-1a340e3ec877-kube-api-access-574gt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.113031 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.113103 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.117289 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.119707 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.133268 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-574gt\" (UniqueName: \"kubernetes.io/projected/bc316311-f861-4cc3-bb9f-1a340e3ec877-kube-api-access-574gt\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.153862 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.546655 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w"] Dec 06 01:31:00 crc kubenswrapper[4991]: I1206 01:31:00.771081 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" event={"ID":"bc316311-f861-4cc3-bb9f-1a340e3ec877","Type":"ContainerStarted","Data":"3b6407130496508ed7dffb5a824a6777fbe415aa7e6847c03629fdc4b15f684d"} Dec 06 01:31:01 crc kubenswrapper[4991]: I1206 01:31:01.787012 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" event={"ID":"bc316311-f861-4cc3-bb9f-1a340e3ec877","Type":"ContainerStarted","Data":"4354cab7dd1086815d0c1376f1ef1e41eefa26999e2e37c628f03b24d26d3384"} Dec 06 01:31:01 crc kubenswrapper[4991]: I1206 01:31:01.815243 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" podStartSLOduration=2.65404373 podStartE2EDuration="2.815212311s" podCreationTimestamp="2025-12-06 01:30:59 +0000 UTC" firstStartedPulling="2025-12-06 01:31:00.548539737 +0000 UTC m=+1733.898830215" lastFinishedPulling="2025-12-06 01:31:00.709708328 +0000 UTC m=+1734.059998796" observedRunningTime="2025-12-06 01:31:01.806262137 +0000 UTC m=+1735.156552615" watchObservedRunningTime="2025-12-06 01:31:01.815212311 +0000 UTC m=+1735.165502779" Dec 06 01:31:08 crc kubenswrapper[4991]: I1206 01:31:08.037414 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:31:08 crc kubenswrapper[4991]: E1206 01:31:08.038405 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:31:21 crc kubenswrapper[4991]: I1206 01:31:21.038363 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:31:21 crc kubenswrapper[4991]: E1206 01:31:21.039550 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:31:36 crc kubenswrapper[4991]: I1206 01:31:36.038262 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:31:36 crc kubenswrapper[4991]: E1206 01:31:36.039127 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:31:36 crc kubenswrapper[4991]: I1206 01:31:36.059013 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7xb7k"] Dec 06 01:31:36 crc kubenswrapper[4991]: I1206 01:31:36.075155 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7xb7k"] Dec 06 01:31:37 crc 
kubenswrapper[4991]: I1206 01:31:37.104349 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25db95d2-7c84-4c51-a212-4590d91cedee" path="/var/lib/kubelet/pods/25db95d2-7c84-4c51-a212-4590d91cedee/volumes" Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.105197 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3d5c-account-create-update-vbgdh"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.116810 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6152-account-create-update-zzm8n"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.126834 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2snkc"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.136218 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3d5c-account-create-update-vbgdh"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.144858 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2snkc"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.153476 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6152-account-create-update-zzm8n"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.162973 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e2eb-account-create-update-pd2t5"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.176925 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e2eb-account-create-update-pd2t5"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.191084 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7wx6t"] Dec 06 01:31:37 crc kubenswrapper[4991]: I1206 01:31:37.201814 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7wx6t"] Dec 06 01:31:39 crc kubenswrapper[4991]: I1206 01:31:39.056313 4991 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31697208-90b9-4041-98b1-9a63a2671b4b" path="/var/lib/kubelet/pods/31697208-90b9-4041-98b1-9a63a2671b4b/volumes" Dec 06 01:31:39 crc kubenswrapper[4991]: I1206 01:31:39.057163 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a85bfe-6df1-44b8-ad38-9a8eba3e662b" path="/var/lib/kubelet/pods/78a85bfe-6df1-44b8-ad38-9a8eba3e662b/volumes" Dec 06 01:31:39 crc kubenswrapper[4991]: I1206 01:31:39.057827 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97898e71-ae7d-405f-898b-5bb25c2948ff" path="/var/lib/kubelet/pods/97898e71-ae7d-405f-898b-5bb25c2948ff/volumes" Dec 06 01:31:39 crc kubenswrapper[4991]: I1206 01:31:39.058349 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adc2f876-daef-47fe-81a6-1eced742e7f6" path="/var/lib/kubelet/pods/adc2f876-daef-47fe-81a6-1eced742e7f6/volumes" Dec 06 01:31:39 crc kubenswrapper[4991]: I1206 01:31:39.060976 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d902c58d-de3b-420a-95de-27f9393ef061" path="/var/lib/kubelet/pods/d902c58d-de3b-420a-95de-27f9393ef061/volumes" Dec 06 01:31:48 crc kubenswrapper[4991]: I1206 01:31:48.038501 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:31:48 crc kubenswrapper[4991]: E1206 01:31:48.039634 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:32:03 crc kubenswrapper[4991]: I1206 01:32:03.039498 4991 scope.go:117] "RemoveContainer" 
containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:32:03 crc kubenswrapper[4991]: E1206 01:32:03.040472 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:32:08 crc kubenswrapper[4991]: I1206 01:32:08.047546 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1f6b-account-create-update-hcrwx"] Dec 06 01:32:08 crc kubenswrapper[4991]: I1206 01:32:08.063252 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pz4lk"] Dec 06 01:32:08 crc kubenswrapper[4991]: I1206 01:32:08.075369 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7g5ts"] Dec 06 01:32:08 crc kubenswrapper[4991]: I1206 01:32:08.087071 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pz4lk"] Dec 06 01:32:08 crc kubenswrapper[4991]: I1206 01:32:08.102829 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7g5ts"] Dec 06 01:32:08 crc kubenswrapper[4991]: I1206 01:32:08.121592 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1f6b-account-create-update-hcrwx"] Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.067563 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59662742-d08d-498c-9e70-53288e527d9f" path="/var/lib/kubelet/pods/59662742-d08d-498c-9e70-53288e527d9f/volumes" Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.068505 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83441b36-34ed-4c10-848b-22ed0afae8cb" 
path="/var/lib/kubelet/pods/83441b36-34ed-4c10-848b-22ed0afae8cb/volumes" Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.069266 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93aee8c2-c556-4635-ab46-f72bb431ed81" path="/var/lib/kubelet/pods/93aee8c2-c556-4635-ab46-f72bb431ed81/volumes" Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.069868 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-073a-account-create-update-5n44d"] Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.069900 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5745-account-create-update-54rdp"] Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.075853 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-q7rp9"] Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.086098 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-073a-account-create-update-5n44d"] Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.096215 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-q7rp9"] Dec 06 01:32:09 crc kubenswrapper[4991]: I1206 01:32:09.107484 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5745-account-create-update-54rdp"] Dec 06 01:32:11 crc kubenswrapper[4991]: I1206 01:32:11.051390 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3672862f-2f4b-487a-b545-48dfd8515d8d" path="/var/lib/kubelet/pods/3672862f-2f4b-487a-b545-48dfd8515d8d/volumes" Dec 06 01:32:11 crc kubenswrapper[4991]: I1206 01:32:11.052217 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb" path="/var/lib/kubelet/pods/ba9c8d1e-e8d2-4bd2-8963-45ea5f7110fb/volumes" Dec 06 01:32:11 crc kubenswrapper[4991]: I1206 01:32:11.053269 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f680de3a-45a6-4f83-b27c-79ade3e8071a" path="/var/lib/kubelet/pods/f680de3a-45a6-4f83-b27c-79ade3e8071a/volumes" Dec 06 01:32:14 crc kubenswrapper[4991]: I1206 01:32:14.049128 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xqxr2"] Dec 06 01:32:14 crc kubenswrapper[4991]: I1206 01:32:14.071737 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xqxr2"] Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.057341 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64b0721-7f94-464a-a14f-3fc10d3b6c53" path="/var/lib/kubelet/pods/e64b0721-7f94-464a-a14f-3fc10d3b6c53/volumes" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.624082 4991 scope.go:117] "RemoveContainer" containerID="7333191c50f374062532cdc0893b90f11fb9de6310c3e103ddcfb276772886c0" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.676197 4991 scope.go:117] "RemoveContainer" containerID="987f550151737b52f41ff96232ad7783addb53fd67e320f7a9588d5a983ea64b" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.759790 4991 scope.go:117] "RemoveContainer" containerID="9b837859471346fe2cdb38e81e756d345537fcd908d4593f2c37feb351f0a824" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.798930 4991 scope.go:117] "RemoveContainer" containerID="5e0cdce1ad6e5bceb0ff6bb0b94bdc291597ef588166dda553c99cd718396e04" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.840906 4991 scope.go:117] "RemoveContainer" containerID="fa3032c957230ec11f138a64bea4c7d7fff691314baba4427ffab297fbf812d7" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.891065 4991 scope.go:117] "RemoveContainer" containerID="72a754ea75011988bbe8de2bc3b63835462a8420758ab77ea1bdfeeb4d1c7b28" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.913291 4991 scope.go:117] "RemoveContainer" containerID="7b563f9d8d80963347c62d3191f2df9a2122f5c1037039569973064b1b1f687f" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 
01:32:15.938683 4991 scope.go:117] "RemoveContainer" containerID="3bc552300db9f5778af42c4fc8bb6bff650f5dbeb539656c8aa06c33751d6be3" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.961435 4991 scope.go:117] "RemoveContainer" containerID="fbd2f9d4a3c7ec83307cd8d78841f72655859a751be415e536128375885f9d27" Dec 06 01:32:15 crc kubenswrapper[4991]: I1206 01:32:15.983024 4991 scope.go:117] "RemoveContainer" containerID="54bec077365df657bfb739dde45467670bdf44df4cdbeff9adf99517b58523d8" Dec 06 01:32:16 crc kubenswrapper[4991]: I1206 01:32:16.004446 4991 scope.go:117] "RemoveContainer" containerID="38b0d127cd8af1a795f53a3bb1de89e71548588b0396410f5b4dc2f31f456b18" Dec 06 01:32:16 crc kubenswrapper[4991]: I1206 01:32:16.025614 4991 scope.go:117] "RemoveContainer" containerID="9ba1f9363eab2d4cefcc3224ad8dc8008dde6ce02e8ae35584d4698c88943997" Dec 06 01:32:16 crc kubenswrapper[4991]: I1206 01:32:16.037970 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:32:16 crc kubenswrapper[4991]: E1206 01:32:16.038434 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:32:16 crc kubenswrapper[4991]: I1206 01:32:16.053348 4991 scope.go:117] "RemoveContainer" containerID="dedf0cac13abd1211ab22dee8d33e362bd80ff30c0b6bf78ffb8c62b24deae22" Dec 06 01:32:29 crc kubenswrapper[4991]: I1206 01:32:29.038065 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:32:29 crc kubenswrapper[4991]: E1206 01:32:29.039233 4991 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:32:44 crc kubenswrapper[4991]: I1206 01:32:44.039033 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:32:44 crc kubenswrapper[4991]: E1206 01:32:44.040529 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:32:44 crc kubenswrapper[4991]: I1206 01:32:44.070318 4991 generic.go:334] "Generic (PLEG): container finished" podID="bc316311-f861-4cc3-bb9f-1a340e3ec877" containerID="4354cab7dd1086815d0c1376f1ef1e41eefa26999e2e37c628f03b24d26d3384" exitCode=0 Dec 06 01:32:44 crc kubenswrapper[4991]: I1206 01:32:44.070386 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" event={"ID":"bc316311-f861-4cc3-bb9f-1a340e3ec877","Type":"ContainerDied","Data":"4354cab7dd1086815d0c1376f1ef1e41eefa26999e2e37c628f03b24d26d3384"} Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.634358 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.835526 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-ssh-key\") pod \"bc316311-f861-4cc3-bb9f-1a340e3ec877\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.835626 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-574gt\" (UniqueName: \"kubernetes.io/projected/bc316311-f861-4cc3-bb9f-1a340e3ec877-kube-api-access-574gt\") pod \"bc316311-f861-4cc3-bb9f-1a340e3ec877\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.835679 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-inventory\") pod \"bc316311-f861-4cc3-bb9f-1a340e3ec877\" (UID: \"bc316311-f861-4cc3-bb9f-1a340e3ec877\") " Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.841789 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc316311-f861-4cc3-bb9f-1a340e3ec877-kube-api-access-574gt" (OuterVolumeSpecName: "kube-api-access-574gt") pod "bc316311-f861-4cc3-bb9f-1a340e3ec877" (UID: "bc316311-f861-4cc3-bb9f-1a340e3ec877"). InnerVolumeSpecName "kube-api-access-574gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.867334 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc316311-f861-4cc3-bb9f-1a340e3ec877" (UID: "bc316311-f861-4cc3-bb9f-1a340e3ec877"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.878929 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-inventory" (OuterVolumeSpecName: "inventory") pod "bc316311-f861-4cc3-bb9f-1a340e3ec877" (UID: "bc316311-f861-4cc3-bb9f-1a340e3ec877"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.938367 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.938423 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-574gt\" (UniqueName: \"kubernetes.io/projected/bc316311-f861-4cc3-bb9f-1a340e3ec877-kube-api-access-574gt\") on node \"crc\" DevicePath \"\"" Dec 06 01:32:45 crc kubenswrapper[4991]: I1206 01:32:45.938452 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc316311-f861-4cc3-bb9f-1a340e3ec877-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.059628 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-5hg4t"] Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.080732 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rssqk"] Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.096379 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rssqk"] Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.108829 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-5hg4t"] Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.109582 4991 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" event={"ID":"bc316311-f861-4cc3-bb9f-1a340e3ec877","Type":"ContainerDied","Data":"3b6407130496508ed7dffb5a824a6777fbe415aa7e6847c03629fdc4b15f684d"} Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.109629 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.109637 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b6407130496508ed7dffb5a824a6777fbe415aa7e6847c03629fdc4b15f684d" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.187783 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44"] Dec 06 01:32:46 crc kubenswrapper[4991]: E1206 01:32:46.188202 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc316311-f861-4cc3-bb9f-1a340e3ec877" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.188225 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc316311-f861-4cc3-bb9f-1a340e3ec877" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.188452 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc316311-f861-4cc3-bb9f-1a340e3ec877" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.189094 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.192486 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.193074 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.193648 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.200333 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.203011 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44"] Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.345703 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptb9h\" (UniqueName: \"kubernetes.io/projected/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-kube-api-access-ptb9h\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.345768 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: 
I1206 01:32:46.345899 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.447313 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.447443 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptb9h\" (UniqueName: \"kubernetes.io/projected/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-kube-api-access-ptb9h\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.447481 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.453035 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.460686 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.474028 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptb9h\" (UniqueName: \"kubernetes.io/projected/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-kube-api-access-ptb9h\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hdf44\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:46 crc kubenswrapper[4991]: I1206 01:32:46.513315 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:32:47 crc kubenswrapper[4991]: I1206 01:32:47.061808 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1befea2d-2e39-4901-98ef-da500dcb82f9" path="/var/lib/kubelet/pods/1befea2d-2e39-4901-98ef-da500dcb82f9/volumes" Dec 06 01:32:47 crc kubenswrapper[4991]: I1206 01:32:47.064076 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48107f10-c266-4cf9-b65f-bca87b8cdc86" path="/var/lib/kubelet/pods/48107f10-c266-4cf9-b65f-bca87b8cdc86/volumes" Dec 06 01:32:47 crc kubenswrapper[4991]: I1206 01:32:47.158382 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44"] Dec 06 01:32:47 crc kubenswrapper[4991]: I1206 01:32:47.169843 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:32:48 crc kubenswrapper[4991]: I1206 01:32:48.132793 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" event={"ID":"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8","Type":"ContainerStarted","Data":"ad85873d79ceba636366038a59764e38d9fc9c801168f2f3016913c36feb8cd8"} Dec 06 01:32:48 crc kubenswrapper[4991]: I1206 01:32:48.133608 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" event={"ID":"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8","Type":"ContainerStarted","Data":"87995888c35217cef52a96a3c62507d99798a09c83cc1ac957aa5911e2f62476"} Dec 06 01:32:48 crc kubenswrapper[4991]: I1206 01:32:48.163611 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" podStartSLOduration=1.961898325 podStartE2EDuration="2.163593028s" podCreationTimestamp="2025-12-06 01:32:46 +0000 UTC" 
firstStartedPulling="2025-12-06 01:32:47.16941671 +0000 UTC m=+1840.519707198" lastFinishedPulling="2025-12-06 01:32:47.371111433 +0000 UTC m=+1840.721401901" observedRunningTime="2025-12-06 01:32:48.15106749 +0000 UTC m=+1841.501357968" watchObservedRunningTime="2025-12-06 01:32:48.163593028 +0000 UTC m=+1841.513883496" Dec 06 01:32:59 crc kubenswrapper[4991]: I1206 01:32:59.037935 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:32:59 crc kubenswrapper[4991]: E1206 01:32:59.038903 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:33:00 crc kubenswrapper[4991]: I1206 01:33:00.075277 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zmlwh"] Dec 06 01:33:00 crc kubenswrapper[4991]: I1206 01:33:00.085361 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-669lm"] Dec 06 01:33:00 crc kubenswrapper[4991]: I1206 01:33:00.105014 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-669lm"] Dec 06 01:33:00 crc kubenswrapper[4991]: I1206 01:33:00.113340 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zmlwh"] Dec 06 01:33:01 crc kubenswrapper[4991]: I1206 01:33:01.053957 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01da8e85-cc82-46af-9370-419a710b2a8d" path="/var/lib/kubelet/pods/01da8e85-cc82-46af-9370-419a710b2a8d/volumes" Dec 06 01:33:01 crc kubenswrapper[4991]: I1206 01:33:01.055087 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ae198f8f-705c-406d-af09-387de07cb200" path="/var/lib/kubelet/pods/ae198f8f-705c-406d-af09-387de07cb200/volumes" Dec 06 01:33:09 crc kubenswrapper[4991]: I1206 01:33:09.075417 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-znbg5"] Dec 06 01:33:09 crc kubenswrapper[4991]: I1206 01:33:09.076068 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7sjq8"] Dec 06 01:33:09 crc kubenswrapper[4991]: I1206 01:33:09.076094 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7sjq8"] Dec 06 01:33:09 crc kubenswrapper[4991]: I1206 01:33:09.082148 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-znbg5"] Dec 06 01:33:10 crc kubenswrapper[4991]: I1206 01:33:10.038323 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:33:10 crc kubenswrapper[4991]: E1206 01:33:10.038726 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:33:11 crc kubenswrapper[4991]: I1206 01:33:11.061134 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f102ba4-81e9-4d61-b5a7-90126ab9dd80" path="/var/lib/kubelet/pods/1f102ba4-81e9-4d61-b5a7-90126ab9dd80/volumes" Dec 06 01:33:11 crc kubenswrapper[4991]: I1206 01:33:11.063487 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41629f2e-563a-4e76-8ac2-530e8d204578" path="/var/lib/kubelet/pods/41629f2e-563a-4e76-8ac2-530e8d204578/volumes" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.350809 
4991 scope.go:117] "RemoveContainer" containerID="c8d22aa6c500f81dff21c8443461eed1cfc666214b361737a7232ea85a332bcd" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.424656 4991 scope.go:117] "RemoveContainer" containerID="4a90743ef33aeb47c6f93044fbd3923478c6129c07c0bf622a6df73a06b52017" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.468792 4991 scope.go:117] "RemoveContainer" containerID="54571095ffb1dce420e52b47fa807af289281627c618d64453b102d3d8876e94" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.572419 4991 scope.go:117] "RemoveContainer" containerID="206f7e2f38328658fe5c0922e582194c7963df17b8d2aa3ff32ca649c0c51a4d" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.605753 4991 scope.go:117] "RemoveContainer" containerID="eee10b7a9a72aa2cf7fe4e5cdd174929dfe17242e129ac00d999664b94fd1593" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.660425 4991 scope.go:117] "RemoveContainer" containerID="749a729c7be8d240d98d7f22cc3e56464e3a2a9de0852d17fa2a4741ed2a4381" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.691136 4991 scope.go:117] "RemoveContainer" containerID="10da77ec5bb248db647aef422fbfa7a37863c9390171bce36e5c692118d93337" Dec 06 01:33:16 crc kubenswrapper[4991]: I1206 01:33:16.739155 4991 scope.go:117] "RemoveContainer" containerID="53ca5ffaa63419d3c128994a5584380b438b4f851235cc4a9fef77cc891d9f7f" Dec 06 01:33:22 crc kubenswrapper[4991]: I1206 01:33:22.038486 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:33:22 crc kubenswrapper[4991]: E1206 01:33:22.039588 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:33:35 crc kubenswrapper[4991]: I1206 01:33:35.037876 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:33:35 crc kubenswrapper[4991]: E1206 01:33:35.038975 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:33:47 crc kubenswrapper[4991]: I1206 01:33:47.053464 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:33:47 crc kubenswrapper[4991]: E1206 01:33:47.054324 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.092510 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hctc4"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.106732 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2l6rx"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.117454 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qt8jb"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.126370 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-hctc4"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.136106 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-adf2-account-create-update-57sjq"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.145489 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2l6rx"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.154145 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-adf2-account-create-update-57sjq"] Dec 06 01:33:57 crc kubenswrapper[4991]: I1206 01:33:57.162936 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qt8jb"] Dec 06 01:33:58 crc kubenswrapper[4991]: I1206 01:33:58.052808 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f23d-account-create-update-lk7cd"] Dec 06 01:33:58 crc kubenswrapper[4991]: I1206 01:33:58.064042 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f23d-account-create-update-lk7cd"] Dec 06 01:33:58 crc kubenswrapper[4991]: I1206 01:33:58.073564 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-92a2-account-create-update-8xs8q"] Dec 06 01:33:58 crc kubenswrapper[4991]: I1206 01:33:58.088042 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-92a2-account-create-update-8xs8q"] Dec 06 01:33:59 crc kubenswrapper[4991]: I1206 01:33:59.060141 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ac1a63-4582-436a-a235-bb4fe90fe6f3" path="/var/lib/kubelet/pods/22ac1a63-4582-436a-a235-bb4fe90fe6f3/volumes" Dec 06 01:33:59 crc kubenswrapper[4991]: I1206 01:33:59.061373 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4899a057-9b8a-4dd0-8219-9a626d671fdf" path="/var/lib/kubelet/pods/4899a057-9b8a-4dd0-8219-9a626d671fdf/volumes" Dec 06 01:33:59 crc kubenswrapper[4991]: I1206 
01:33:59.062650 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f786a82-5143-4f90-aed1-f9d4ffa49f00" path="/var/lib/kubelet/pods/6f786a82-5143-4f90-aed1-f9d4ffa49f00/volumes" Dec 06 01:33:59 crc kubenswrapper[4991]: I1206 01:33:59.063859 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f87ca3d-5755-4aa8-8cd0-99f6841bbd83" path="/var/lib/kubelet/pods/9f87ca3d-5755-4aa8-8cd0-99f6841bbd83/volumes" Dec 06 01:33:59 crc kubenswrapper[4991]: I1206 01:33:59.066624 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad619ec-5672-46aa-b7d8-2f996561383d" path="/var/lib/kubelet/pods/cad619ec-5672-46aa-b7d8-2f996561383d/volumes" Dec 06 01:33:59 crc kubenswrapper[4991]: I1206 01:33:59.067948 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf1edef-446b-4eb8-beba-fde054153a2b" path="/var/lib/kubelet/pods/dbf1edef-446b-4eb8-beba-fde054153a2b/volumes" Dec 06 01:34:02 crc kubenswrapper[4991]: I1206 01:34:02.037422 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:34:02 crc kubenswrapper[4991]: E1206 01:34:02.038430 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:34:05 crc kubenswrapper[4991]: I1206 01:34:05.136322 4991 generic.go:334] "Generic (PLEG): container finished" podID="62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" containerID="ad85873d79ceba636366038a59764e38d9fc9c801168f2f3016913c36feb8cd8" exitCode=0 Dec 06 01:34:05 crc kubenswrapper[4991]: I1206 01:34:05.136470 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" event={"ID":"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8","Type":"ContainerDied","Data":"ad85873d79ceba636366038a59764e38d9fc9c801168f2f3016913c36feb8cd8"} Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.661200 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.806556 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptb9h\" (UniqueName: \"kubernetes.io/projected/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-kube-api-access-ptb9h\") pod \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.806603 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-inventory\") pod \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.806776 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-ssh-key\") pod \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\" (UID: \"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8\") " Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.819068 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-kube-api-access-ptb9h" (OuterVolumeSpecName: "kube-api-access-ptb9h") pod "62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" (UID: "62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8"). InnerVolumeSpecName "kube-api-access-ptb9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.840717 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-inventory" (OuterVolumeSpecName: "inventory") pod "62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" (UID: "62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.844918 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" (UID: "62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.908811 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.908843 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptb9h\" (UniqueName: \"kubernetes.io/projected/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-kube-api-access-ptb9h\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:06 crc kubenswrapper[4991]: I1206 01:34:06.908857 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.162859 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" 
event={"ID":"62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8","Type":"ContainerDied","Data":"87995888c35217cef52a96a3c62507d99798a09c83cc1ac957aa5911e2f62476"} Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.163317 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87995888c35217cef52a96a3c62507d99798a09c83cc1ac957aa5911e2f62476" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.163044 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hdf44" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.287694 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj"] Dec 06 01:34:07 crc kubenswrapper[4991]: E1206 01:34:07.289057 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.289089 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.289587 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.310697 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.313717 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.313941 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.314190 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.315735 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj"] Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.316142 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.420586 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvttp\" (UniqueName: \"kubernetes.io/projected/e4c4d67b-be1b-4502-a13d-96a119cbb436-kube-api-access-tvttp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.420698 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 
01:34:07.421203 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.523535 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.523913 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.524229 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvttp\" (UniqueName: \"kubernetes.io/projected/e4c4d67b-be1b-4502-a13d-96a119cbb436-kube-api-access-tvttp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.530837 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.534882 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.557142 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvttp\" (UniqueName: \"kubernetes.io/projected/e4c4d67b-be1b-4502-a13d-96a119cbb436-kube-api-access-tvttp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:07 crc kubenswrapper[4991]: I1206 01:34:07.637269 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.257098 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj"] Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.667967 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pm2rl"] Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.672451 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.685220 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pm2rl"] Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.851901 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvxrh\" (UniqueName: \"kubernetes.io/projected/cfb6a485-bd5c-4974-bc47-806a475931b9-kube-api-access-zvxrh\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.851968 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-catalog-content\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.852124 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-utilities\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.864423 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xpcwh"] Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.870670 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.882004 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpcwh"] Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.953660 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-utilities\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.953901 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxrh\" (UniqueName: \"kubernetes.io/projected/cfb6a485-bd5c-4974-bc47-806a475931b9-kube-api-access-zvxrh\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.953956 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-catalog-content\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.954632 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-utilities\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.954700 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-catalog-content\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.979025 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxrh\" (UniqueName: \"kubernetes.io/projected/cfb6a485-bd5c-4974-bc47-806a475931b9-kube-api-access-zvxrh\") pod \"community-operators-pm2rl\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:08 crc kubenswrapper[4991]: I1206 01:34:08.992721 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.056799 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-utilities\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.056884 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njk42\" (UniqueName: \"kubernetes.io/projected/a2cdccf7-38a8-41cc-994f-480434585c45-kube-api-access-njk42\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.056964 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-catalog-content\") pod \"certified-operators-xpcwh\" (UID: 
\"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.161373 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-utilities\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.161476 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njk42\" (UniqueName: \"kubernetes.io/projected/a2cdccf7-38a8-41cc-994f-480434585c45-kube-api-access-njk42\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.161496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-catalog-content\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.162409 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-catalog-content\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.166302 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-utilities\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") 
" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.211215 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" event={"ID":"e4c4d67b-be1b-4502-a13d-96a119cbb436","Type":"ContainerStarted","Data":"cd5f09699a33fb35a2254dcee558ca8e63b5c50be8a0039d9593bf1ffd3becc7"} Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.211274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" event={"ID":"e4c4d67b-be1b-4502-a13d-96a119cbb436","Type":"ContainerStarted","Data":"104f1af6ec1a01ed5ec1d6c7bb8134b97289dac49239253b3dd8cb14a7926773"} Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.229758 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njk42\" (UniqueName: \"kubernetes.io/projected/a2cdccf7-38a8-41cc-994f-480434585c45-kube-api-access-njk42\") pod \"certified-operators-xpcwh\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.255687 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" podStartSLOduration=2.066104879 podStartE2EDuration="2.255473587s" podCreationTimestamp="2025-12-06 01:34:07 +0000 UTC" firstStartedPulling="2025-12-06 01:34:08.259922298 +0000 UTC m=+1921.610212766" lastFinishedPulling="2025-12-06 01:34:08.449291006 +0000 UTC m=+1921.799581474" observedRunningTime="2025-12-06 01:34:09.252493608 +0000 UTC m=+1922.602784076" watchObservedRunningTime="2025-12-06 01:34:09.255473587 +0000 UTC m=+1922.605764055" Dec 06 01:34:09 crc kubenswrapper[4991]: I1206 01:34:09.346562 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pm2rl"] Dec 06 01:34:09 crc 
kubenswrapper[4991]: I1206 01:34:09.493543 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:10 crc kubenswrapper[4991]: I1206 01:34:10.029118 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpcwh"] Dec 06 01:34:10 crc kubenswrapper[4991]: I1206 01:34:10.221391 4991 generic.go:334] "Generic (PLEG): container finished" podID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerID="d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c" exitCode=0 Dec 06 01:34:10 crc kubenswrapper[4991]: I1206 01:34:10.221448 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerDied","Data":"d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c"} Dec 06 01:34:10 crc kubenswrapper[4991]: I1206 01:34:10.221834 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerStarted","Data":"024f9ad0df9933a5013295b06887fb899aa87b594a269e1b5a6dfb5cb1f0d3f0"} Dec 06 01:34:10 crc kubenswrapper[4991]: I1206 01:34:10.223588 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerStarted","Data":"7fda2d8912d57f083229fcf9ff40422fc358e4fbdbcb09cf5905dd115405e0a7"} Dec 06 01:34:11 crc kubenswrapper[4991]: I1206 01:34:11.232764 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerStarted","Data":"d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14"} Dec 06 01:34:11 crc kubenswrapper[4991]: I1206 01:34:11.235371 4991 generic.go:334] "Generic (PLEG): 
container finished" podID="a2cdccf7-38a8-41cc-994f-480434585c45" containerID="b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef" exitCode=0 Dec 06 01:34:11 crc kubenswrapper[4991]: I1206 01:34:11.235417 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerDied","Data":"b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef"} Dec 06 01:34:12 crc kubenswrapper[4991]: I1206 01:34:12.251004 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerStarted","Data":"b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283"} Dec 06 01:34:12 crc kubenswrapper[4991]: I1206 01:34:12.254726 4991 generic.go:334] "Generic (PLEG): container finished" podID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerID="d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14" exitCode=0 Dec 06 01:34:12 crc kubenswrapper[4991]: I1206 01:34:12.254756 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerDied","Data":"d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14"} Dec 06 01:34:12 crc kubenswrapper[4991]: E1206 01:34:12.501489 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2cdccf7_38a8_41cc_994f_480434585c45.slice/crio-b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283.scope\": RecentStats: unable to find data in memory cache]" Dec 06 01:34:13 crc kubenswrapper[4991]: I1206 01:34:13.281240 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" 
event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerStarted","Data":"5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab"} Dec 06 01:34:13 crc kubenswrapper[4991]: I1206 01:34:13.312084 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pm2rl" podStartSLOduration=2.806389465 podStartE2EDuration="5.312049175s" podCreationTimestamp="2025-12-06 01:34:08 +0000 UTC" firstStartedPulling="2025-12-06 01:34:10.223432538 +0000 UTC m=+1923.573723016" lastFinishedPulling="2025-12-06 01:34:12.729092218 +0000 UTC m=+1926.079382726" observedRunningTime="2025-12-06 01:34:13.309616541 +0000 UTC m=+1926.659907039" watchObservedRunningTime="2025-12-06 01:34:13.312049175 +0000 UTC m=+1926.662339643" Dec 06 01:34:14 crc kubenswrapper[4991]: I1206 01:34:14.038823 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:34:14 crc kubenswrapper[4991]: E1206 01:34:14.039455 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:34:14 crc kubenswrapper[4991]: I1206 01:34:14.295953 4991 generic.go:334] "Generic (PLEG): container finished" podID="a2cdccf7-38a8-41cc-994f-480434585c45" containerID="b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283" exitCode=0 Dec 06 01:34:14 crc kubenswrapper[4991]: I1206 01:34:14.296036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" 
event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerDied","Data":"b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283"} Dec 06 01:34:15 crc kubenswrapper[4991]: I1206 01:34:15.308017 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerStarted","Data":"6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01"} Dec 06 01:34:15 crc kubenswrapper[4991]: I1206 01:34:15.310090 4991 generic.go:334] "Generic (PLEG): container finished" podID="e4c4d67b-be1b-4502-a13d-96a119cbb436" containerID="cd5f09699a33fb35a2254dcee558ca8e63b5c50be8a0039d9593bf1ffd3becc7" exitCode=0 Dec 06 01:34:15 crc kubenswrapper[4991]: I1206 01:34:15.310144 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" event={"ID":"e4c4d67b-be1b-4502-a13d-96a119cbb436","Type":"ContainerDied","Data":"cd5f09699a33fb35a2254dcee558ca8e63b5c50be8a0039d9593bf1ffd3becc7"} Dec 06 01:34:15 crc kubenswrapper[4991]: I1206 01:34:15.340429 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xpcwh" podStartSLOduration=3.824920121 podStartE2EDuration="7.340403096s" podCreationTimestamp="2025-12-06 01:34:08 +0000 UTC" firstStartedPulling="2025-12-06 01:34:11.237428224 +0000 UTC m=+1924.587718692" lastFinishedPulling="2025-12-06 01:34:14.752911189 +0000 UTC m=+1928.103201667" observedRunningTime="2025-12-06 01:34:15.330853824 +0000 UTC m=+1928.681144372" watchObservedRunningTime="2025-12-06 01:34:15.340403096 +0000 UTC m=+1928.690693604" Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.798342 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.929159 4991 scope.go:117] "RemoveContainer" containerID="0235a66113624b53f29859d28fc5f6a0ffa576010f106a33ba72c4c1ea8aa8d8" Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.932790 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-ssh-key\") pod \"e4c4d67b-be1b-4502-a13d-96a119cbb436\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.932883 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvttp\" (UniqueName: \"kubernetes.io/projected/e4c4d67b-be1b-4502-a13d-96a119cbb436-kube-api-access-tvttp\") pod \"e4c4d67b-be1b-4502-a13d-96a119cbb436\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.933159 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-inventory\") pod \"e4c4d67b-be1b-4502-a13d-96a119cbb436\" (UID: \"e4c4d67b-be1b-4502-a13d-96a119cbb436\") " Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.958545 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c4d67b-be1b-4502-a13d-96a119cbb436-kube-api-access-tvttp" (OuterVolumeSpecName: "kube-api-access-tvttp") pod "e4c4d67b-be1b-4502-a13d-96a119cbb436" (UID: "e4c4d67b-be1b-4502-a13d-96a119cbb436"). InnerVolumeSpecName "kube-api-access-tvttp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.962744 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-inventory" (OuterVolumeSpecName: "inventory") pod "e4c4d67b-be1b-4502-a13d-96a119cbb436" (UID: "e4c4d67b-be1b-4502-a13d-96a119cbb436"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.964472 4991 scope.go:117] "RemoveContainer" containerID="8e09c4543f5a3160784e09fed0e4849d9c5931ae2962530def345e545c7e8e05" Dec 06 01:34:16 crc kubenswrapper[4991]: I1206 01:34:16.974321 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4c4d67b-be1b-4502-a13d-96a119cbb436" (UID: "e4c4d67b-be1b-4502-a13d-96a119cbb436"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.035296 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.035337 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c4d67b-be1b-4502-a13d-96a119cbb436-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.035352 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvttp\" (UniqueName: \"kubernetes.io/projected/e4c4d67b-be1b-4502-a13d-96a119cbb436-kube-api-access-tvttp\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.065563 4991 scope.go:117] "RemoveContainer" containerID="84f20f5129d9f77ae1b49058cf2080084cf5f23d46cb8e6fd7eeff9066a5d747" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.098431 4991 scope.go:117] "RemoveContainer" containerID="3b491f4c057603a34220deafe82de216d1ee9347fc6ae89a40de77462dabc0af" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.124470 4991 scope.go:117] "RemoveContainer" containerID="cd23f41039c06876fc04bf7affc4879fd72f6c958db0e42354341dce0b857b95" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.153508 4991 scope.go:117] "RemoveContainer" containerID="f4a6f9d709cfe08d2d52f893780718dbcf0330c9cf1fc125856dd67768020bed" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.342938 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" event={"ID":"e4c4d67b-be1b-4502-a13d-96a119cbb436","Type":"ContainerDied","Data":"104f1af6ec1a01ed5ec1d6c7bb8134b97289dac49239253b3dd8cb14a7926773"} Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.343028 4991 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="104f1af6ec1a01ed5ec1d6c7bb8134b97289dac49239253b3dd8cb14a7926773" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.343097 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.531782 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f"] Dec 06 01:34:17 crc kubenswrapper[4991]: E1206 01:34:17.532896 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c4d67b-be1b-4502-a13d-96a119cbb436" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.533061 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c4d67b-be1b-4502-a13d-96a119cbb436" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.533511 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c4d67b-be1b-4502-a13d-96a119cbb436" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.534849 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.537978 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.538004 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.541524 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.541833 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.542898 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f"] Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.649470 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.649604 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86n22\" (UniqueName: \"kubernetes.io/projected/801ece65-0bd2-49cb-9c03-439c624bb145-kube-api-access-86n22\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.649678 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.751730 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.752419 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.752534 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86n22\" (UniqueName: \"kubernetes.io/projected/801ece65-0bd2-49cb-9c03-439c624bb145-kube-api-access-86n22\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.760749 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: 
\"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.761575 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.778321 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86n22\" (UniqueName: \"kubernetes.io/projected/801ece65-0bd2-49cb-9c03-439c624bb145-kube-api-access-86n22\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gml2f\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:17 crc kubenswrapper[4991]: I1206 01:34:17.889606 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:34:18 crc kubenswrapper[4991]: I1206 01:34:18.459410 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f"] Dec 06 01:34:18 crc kubenswrapper[4991]: W1206 01:34:18.460527 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801ece65_0bd2_49cb_9c03_439c624bb145.slice/crio-d13565f17d71f878d76661928b58c7a0983c07d710522358d4abc164425ed2cb WatchSource:0}: Error finding container d13565f17d71f878d76661928b58c7a0983c07d710522358d4abc164425ed2cb: Status 404 returned error can't find the container with id d13565f17d71f878d76661928b58c7a0983c07d710522358d4abc164425ed2cb Dec 06 01:34:18 crc kubenswrapper[4991]: I1206 01:34:18.993567 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:18 crc kubenswrapper[4991]: I1206 01:34:18.994010 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.065836 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.367688 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" event={"ID":"801ece65-0bd2-49cb-9c03-439c624bb145","Type":"ContainerStarted","Data":"c2281e23fab948809761211ba2be467d7452c3e1d1fe9ed591c12381bc0e3887"} Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.367758 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" 
event={"ID":"801ece65-0bd2-49cb-9c03-439c624bb145","Type":"ContainerStarted","Data":"d13565f17d71f878d76661928b58c7a0983c07d710522358d4abc164425ed2cb"} Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.394390 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" podStartSLOduration=2.194697996 podStartE2EDuration="2.394352126s" podCreationTimestamp="2025-12-06 01:34:17 +0000 UTC" firstStartedPulling="2025-12-06 01:34:18.464196704 +0000 UTC m=+1931.814487212" lastFinishedPulling="2025-12-06 01:34:18.663850864 +0000 UTC m=+1932.014141342" observedRunningTime="2025-12-06 01:34:19.390628457 +0000 UTC m=+1932.740918965" watchObservedRunningTime="2025-12-06 01:34:19.394352126 +0000 UTC m=+1932.744642624" Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.478363 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.494037 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.496881 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.555107 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pm2rl"] Dec 06 01:34:19 crc kubenswrapper[4991]: I1206 01:34:19.562494 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:20 crc kubenswrapper[4991]: I1206 01:34:20.488192 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:21 crc kubenswrapper[4991]: I1206 01:34:21.394152 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pm2rl" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="registry-server" containerID="cri-o://5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab" gracePeriod=2 Dec 06 01:34:21 crc kubenswrapper[4991]: I1206 01:34:21.845554 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpcwh"] Dec 06 01:34:21 crc kubenswrapper[4991]: I1206 01:34:21.951403 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.057241 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-catalog-content\") pod \"cfb6a485-bd5c-4974-bc47-806a475931b9\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.057331 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvxrh\" (UniqueName: \"kubernetes.io/projected/cfb6a485-bd5c-4974-bc47-806a475931b9-kube-api-access-zvxrh\") pod \"cfb6a485-bd5c-4974-bc47-806a475931b9\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.057357 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-utilities\") pod \"cfb6a485-bd5c-4974-bc47-806a475931b9\" (UID: \"cfb6a485-bd5c-4974-bc47-806a475931b9\") " Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.058288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-utilities" (OuterVolumeSpecName: "utilities") pod 
"cfb6a485-bd5c-4974-bc47-806a475931b9" (UID: "cfb6a485-bd5c-4974-bc47-806a475931b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.067889 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb6a485-bd5c-4974-bc47-806a475931b9-kube-api-access-zvxrh" (OuterVolumeSpecName: "kube-api-access-zvxrh") pod "cfb6a485-bd5c-4974-bc47-806a475931b9" (UID: "cfb6a485-bd5c-4974-bc47-806a475931b9"). InnerVolumeSpecName "kube-api-access-zvxrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.117251 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfb6a485-bd5c-4974-bc47-806a475931b9" (UID: "cfb6a485-bd5c-4974-bc47-806a475931b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.159462 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvxrh\" (UniqueName: \"kubernetes.io/projected/cfb6a485-bd5c-4974-bc47-806a475931b9-kube-api-access-zvxrh\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.159513 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.159531 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb6a485-bd5c-4974-bc47-806a475931b9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.428927 4991 generic.go:334] "Generic (PLEG): container finished" podID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerID="5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab" exitCode=0 Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.430392 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pm2rl" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.431250 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerDied","Data":"5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab"} Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.431338 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2rl" event={"ID":"cfb6a485-bd5c-4974-bc47-806a475931b9","Type":"ContainerDied","Data":"024f9ad0df9933a5013295b06887fb899aa87b594a269e1b5a6dfb5cb1f0d3f0"} Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.431359 4991 scope.go:117] "RemoveContainer" containerID="5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.454482 4991 scope.go:117] "RemoveContainer" containerID="d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.487372 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pm2rl"] Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.495945 4991 scope.go:117] "RemoveContainer" containerID="d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.498020 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pm2rl"] Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.560604 4991 scope.go:117] "RemoveContainer" containerID="5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab" Dec 06 01:34:22 crc kubenswrapper[4991]: E1206 01:34:22.560978 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab\": container with ID starting with 5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab not found: ID does not exist" containerID="5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.561035 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab"} err="failed to get container status \"5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab\": rpc error: code = NotFound desc = could not find container \"5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab\": container with ID starting with 5f855fba09846b9175d60d59c2ba50719c71516168d0e32fcdedcc1d1e212fab not found: ID does not exist" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.561066 4991 scope.go:117] "RemoveContainer" containerID="d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14" Dec 06 01:34:22 crc kubenswrapper[4991]: E1206 01:34:22.561465 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14\": container with ID starting with d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14 not found: ID does not exist" containerID="d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.561489 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14"} err="failed to get container status \"d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14\": rpc error: code = NotFound desc = could not find container \"d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14\": container with ID 
starting with d9f9b7f5e20873b164ae2648ea470c32ef7f9fa60200621236e064d833d7ca14 not found: ID does not exist" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.561512 4991 scope.go:117] "RemoveContainer" containerID="d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c" Dec 06 01:34:22 crc kubenswrapper[4991]: E1206 01:34:22.561905 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c\": container with ID starting with d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c not found: ID does not exist" containerID="d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c" Dec 06 01:34:22 crc kubenswrapper[4991]: I1206 01:34:22.561941 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c"} err="failed to get container status \"d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c\": rpc error: code = NotFound desc = could not find container \"d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c\": container with ID starting with d8d562711c3fecc501b4a5d34c5dcd7ae60878e18a9f2e983b4e6dea5b0f252c not found: ID does not exist" Dec 06 01:34:23 crc kubenswrapper[4991]: I1206 01:34:23.056503 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" path="/var/lib/kubelet/pods/cfb6a485-bd5c-4974-bc47-806a475931b9/volumes" Dec 06 01:34:23 crc kubenswrapper[4991]: I1206 01:34:23.440313 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xpcwh" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="registry-server" containerID="cri-o://6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01" gracePeriod=2 Dec 06 01:34:23 crc 
kubenswrapper[4991]: I1206 01:34:23.908948 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:23 crc kubenswrapper[4991]: I1206 01:34:23.998802 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-catalog-content\") pod \"a2cdccf7-38a8-41cc-994f-480434585c45\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " Dec 06 01:34:23 crc kubenswrapper[4991]: I1206 01:34:23.998882 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njk42\" (UniqueName: \"kubernetes.io/projected/a2cdccf7-38a8-41cc-994f-480434585c45-kube-api-access-njk42\") pod \"a2cdccf7-38a8-41cc-994f-480434585c45\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " Dec 06 01:34:23 crc kubenswrapper[4991]: I1206 01:34:23.999051 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-utilities\") pod \"a2cdccf7-38a8-41cc-994f-480434585c45\" (UID: \"a2cdccf7-38a8-41cc-994f-480434585c45\") " Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.000369 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-utilities" (OuterVolumeSpecName: "utilities") pod "a2cdccf7-38a8-41cc-994f-480434585c45" (UID: "a2cdccf7-38a8-41cc-994f-480434585c45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.006815 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2cdccf7-38a8-41cc-994f-480434585c45-kube-api-access-njk42" (OuterVolumeSpecName: "kube-api-access-njk42") pod "a2cdccf7-38a8-41cc-994f-480434585c45" (UID: "a2cdccf7-38a8-41cc-994f-480434585c45"). InnerVolumeSpecName "kube-api-access-njk42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.057217 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2cdccf7-38a8-41cc-994f-480434585c45" (UID: "a2cdccf7-38a8-41cc-994f-480434585c45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.101761 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.101800 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cdccf7-38a8-41cc-994f-480434585c45-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.101953 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njk42\" (UniqueName: \"kubernetes.io/projected/a2cdccf7-38a8-41cc-994f-480434585c45-kube-api-access-njk42\") on node \"crc\" DevicePath \"\"" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.451651 4991 generic.go:334] "Generic (PLEG): container finished" podID="a2cdccf7-38a8-41cc-994f-480434585c45" 
containerID="6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01" exitCode=0 Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.451713 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerDied","Data":"6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01"} Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.451770 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpcwh" event={"ID":"a2cdccf7-38a8-41cc-994f-480434585c45","Type":"ContainerDied","Data":"7fda2d8912d57f083229fcf9ff40422fc358e4fbdbcb09cf5905dd115405e0a7"} Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.451799 4991 scope.go:117] "RemoveContainer" containerID="6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.452244 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xpcwh" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.470299 4991 scope.go:117] "RemoveContainer" containerID="b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.499073 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpcwh"] Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.501616 4991 scope.go:117] "RemoveContainer" containerID="b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.509326 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xpcwh"] Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.561800 4991 scope.go:117] "RemoveContainer" containerID="6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01" Dec 06 01:34:24 crc kubenswrapper[4991]: E1206 01:34:24.562387 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01\": container with ID starting with 6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01 not found: ID does not exist" containerID="6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.562458 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01"} err="failed to get container status \"6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01\": rpc error: code = NotFound desc = could not find container \"6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01\": container with ID starting with 6ab8b649970d793056f15cfe7a9f725980534f1688e3a30683e8832f5c2c3b01 not 
found: ID does not exist" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.562503 4991 scope.go:117] "RemoveContainer" containerID="b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283" Dec 06 01:34:24 crc kubenswrapper[4991]: E1206 01:34:24.563233 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283\": container with ID starting with b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283 not found: ID does not exist" containerID="b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.563278 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283"} err="failed to get container status \"b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283\": rpc error: code = NotFound desc = could not find container \"b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283\": container with ID starting with b6489d3a60d403bb5f7a6736772b0ddddb5214473e4bf06b35478f5f5f03c283 not found: ID does not exist" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.563311 4991 scope.go:117] "RemoveContainer" containerID="b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef" Dec 06 01:34:24 crc kubenswrapper[4991]: E1206 01:34:24.563608 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef\": container with ID starting with b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef not found: ID does not exist" containerID="b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef" Dec 06 01:34:24 crc kubenswrapper[4991]: I1206 01:34:24.563644 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef"} err="failed to get container status \"b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef\": rpc error: code = NotFound desc = could not find container \"b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef\": container with ID starting with b3da2ed849964c6a47768e40e6fea64327db6dd56e9be7906b4e882672f363ef not found: ID does not exist" Dec 06 01:34:25 crc kubenswrapper[4991]: I1206 01:34:25.038384 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:34:25 crc kubenswrapper[4991]: E1206 01:34:25.038919 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:34:25 crc kubenswrapper[4991]: I1206 01:34:25.058182 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" path="/var/lib/kubelet/pods/a2cdccf7-38a8-41cc-994f-480434585c45/volumes" Dec 06 01:34:29 crc kubenswrapper[4991]: I1206 01:34:29.065514 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ktmxx"] Dec 06 01:34:29 crc kubenswrapper[4991]: I1206 01:34:29.067971 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ktmxx"] Dec 06 01:34:31 crc kubenswrapper[4991]: I1206 01:34:31.054613 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4490e9af-2abc-4d2c-83e2-ce5b7324a32a" 
path="/var/lib/kubelet/pods/4490e9af-2abc-4d2c-83e2-ce5b7324a32a/volumes" Dec 06 01:34:37 crc kubenswrapper[4991]: I1206 01:34:37.048370 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:34:37 crc kubenswrapper[4991]: E1206 01:34:37.049569 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:34:50 crc kubenswrapper[4991]: I1206 01:34:50.082041 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:34:50 crc kubenswrapper[4991]: E1206 01:34:50.082870 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:34:52 crc kubenswrapper[4991]: I1206 01:34:52.043712 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-m77vk"] Dec 06 01:34:52 crc kubenswrapper[4991]: I1206 01:34:52.054898 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jtnk5"] Dec 06 01:34:52 crc kubenswrapper[4991]: I1206 01:34:52.064648 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-m77vk"] Dec 06 01:34:52 crc kubenswrapper[4991]: I1206 01:34:52.076028 4991 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jtnk5"] Dec 06 01:34:53 crc kubenswrapper[4991]: I1206 01:34:53.049651 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc9bd44-9dec-455c-85a1-87064befa3ce" path="/var/lib/kubelet/pods/3fc9bd44-9dec-455c-85a1-87064befa3ce/volumes" Dec 06 01:34:53 crc kubenswrapper[4991]: I1206 01:34:53.050947 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828f26ed-d1bf-4625-a643-a72c9207b0a5" path="/var/lib/kubelet/pods/828f26ed-d1bf-4625-a643-a72c9207b0a5/volumes" Dec 06 01:34:58 crc kubenswrapper[4991]: I1206 01:34:58.882726 4991 generic.go:334] "Generic (PLEG): container finished" podID="801ece65-0bd2-49cb-9c03-439c624bb145" containerID="c2281e23fab948809761211ba2be467d7452c3e1d1fe9ed591c12381bc0e3887" exitCode=0 Dec 06 01:34:58 crc kubenswrapper[4991]: I1206 01:34:58.882816 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" event={"ID":"801ece65-0bd2-49cb-9c03-439c624bb145","Type":"ContainerDied","Data":"c2281e23fab948809761211ba2be467d7452c3e1d1fe9ed591c12381bc0e3887"} Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.333613 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.415614 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86n22\" (UniqueName: \"kubernetes.io/projected/801ece65-0bd2-49cb-9c03-439c624bb145-kube-api-access-86n22\") pod \"801ece65-0bd2-49cb-9c03-439c624bb145\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.415743 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-ssh-key\") pod \"801ece65-0bd2-49cb-9c03-439c624bb145\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.415869 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-inventory\") pod \"801ece65-0bd2-49cb-9c03-439c624bb145\" (UID: \"801ece65-0bd2-49cb-9c03-439c624bb145\") " Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.422347 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801ece65-0bd2-49cb-9c03-439c624bb145-kube-api-access-86n22" (OuterVolumeSpecName: "kube-api-access-86n22") pod "801ece65-0bd2-49cb-9c03-439c624bb145" (UID: "801ece65-0bd2-49cb-9c03-439c624bb145"). InnerVolumeSpecName "kube-api-access-86n22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.447815 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-inventory" (OuterVolumeSpecName: "inventory") pod "801ece65-0bd2-49cb-9c03-439c624bb145" (UID: "801ece65-0bd2-49cb-9c03-439c624bb145"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.467193 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "801ece65-0bd2-49cb-9c03-439c624bb145" (UID: "801ece65-0bd2-49cb-9c03-439c624bb145"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.517722 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86n22\" (UniqueName: \"kubernetes.io/projected/801ece65-0bd2-49cb-9c03-439c624bb145-kube-api-access-86n22\") on node \"crc\" DevicePath \"\"" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.517759 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.517772 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801ece65-0bd2-49cb-9c03-439c624bb145-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.921163 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" event={"ID":"801ece65-0bd2-49cb-9c03-439c624bb145","Type":"ContainerDied","Data":"d13565f17d71f878d76661928b58c7a0983c07d710522358d4abc164425ed2cb"} Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.921225 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13565f17d71f878d76661928b58c7a0983c07d710522358d4abc164425ed2cb" Dec 06 01:35:00 crc kubenswrapper[4991]: I1206 01:35:00.921290 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gml2f" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.033534 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt"] Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034038 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="extract-content" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034057 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="extract-content" Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034067 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="registry-server" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034075 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="registry-server" Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034115 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="extract-utilities" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034124 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="extract-utilities" Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034133 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="registry-server" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034140 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="registry-server" Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034151 4991 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="extract-content" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034158 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="extract-content" Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034171 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801ece65-0bd2-49cb-9c03-439c624bb145" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034181 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="801ece65-0bd2-49cb-9c03-439c624bb145" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:35:01 crc kubenswrapper[4991]: E1206 01:35:01.034199 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="extract-utilities" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034207 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="extract-utilities" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034445 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2cdccf7-38a8-41cc-994f-480434585c45" containerName="registry-server" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034460 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb6a485-bd5c-4974-bc47-806a475931b9" containerName="registry-server" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.034486 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="801ece65-0bd2-49cb-9c03-439c624bb145" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.035541 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.038545 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.039049 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.039195 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.039629 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.101325 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt"] Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.235410 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdngx\" (UniqueName: \"kubernetes.io/projected/69c9b4b8-de83-4f43-8670-e9279daadb9b-kube-api-access-rdngx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.235491 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.235763 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.336934 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.337009 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdngx\" (UniqueName: \"kubernetes.io/projected/69c9b4b8-de83-4f43-8670-e9279daadb9b-kube-api-access-rdngx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.337101 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.343709 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: 
\"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.347515 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.368541 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdngx\" (UniqueName: \"kubernetes.io/projected/69c9b4b8-de83-4f43-8670-e9279daadb9b-kube-api-access-rdngx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.414534 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:01 crc kubenswrapper[4991]: I1206 01:35:01.995704 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt"] Dec 06 01:35:02 crc kubenswrapper[4991]: I1206 01:35:02.943761 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" event={"ID":"69c9b4b8-de83-4f43-8670-e9279daadb9b","Type":"ContainerStarted","Data":"d3d69cc2ca60618e9d19a324a000ff21bdb3e3b3da2bd7b47bf82455d244f015"} Dec 06 01:35:02 crc kubenswrapper[4991]: I1206 01:35:02.944203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" event={"ID":"69c9b4b8-de83-4f43-8670-e9279daadb9b","Type":"ContainerStarted","Data":"eccdcf8e5e52934fafdf03614a6e52b075ee1e764781fdba2427c5ae2f94e3f2"} Dec 06 01:35:02 crc kubenswrapper[4991]: I1206 01:35:02.975105 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" podStartSLOduration=1.77727595 podStartE2EDuration="1.975081921s" podCreationTimestamp="2025-12-06 01:35:01 +0000 UTC" firstStartedPulling="2025-12-06 01:35:01.996959592 +0000 UTC m=+1975.347250060" lastFinishedPulling="2025-12-06 01:35:02.194765523 +0000 UTC m=+1975.545056031" observedRunningTime="2025-12-06 01:35:02.966233007 +0000 UTC m=+1976.316523485" watchObservedRunningTime="2025-12-06 01:35:02.975081921 +0000 UTC m=+1976.325372409" Dec 06 01:35:03 crc kubenswrapper[4991]: I1206 01:35:03.038329 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:35:03 crc kubenswrapper[4991]: E1206 01:35:03.038744 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:35:17 crc kubenswrapper[4991]: I1206 01:35:17.054709 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:35:17 crc kubenswrapper[4991]: I1206 01:35:17.303584 4991 scope.go:117] "RemoveContainer" containerID="e4520f4c284293c26259544c9abb2578872b250069cdf4ad7a9791a263cff66c" Dec 06 01:35:17 crc kubenswrapper[4991]: I1206 01:35:17.351803 4991 scope.go:117] "RemoveContainer" containerID="8e1b13a8c747a4ec45a52166136620ce2aa7810514a9fcfaae96d18504b018cb" Dec 06 01:35:17 crc kubenswrapper[4991]: I1206 01:35:17.411573 4991 scope.go:117] "RemoveContainer" containerID="d1285efbb0688e01733671f5dbc93c616f9c7da3c7945ee4ef98b0d0c37a60b0" Dec 06 01:35:18 crc kubenswrapper[4991]: I1206 01:35:18.131391 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"a03b675513fc3791b58894fb873d91c11d8a1aef5118ca63efd76f9359bb700d"} Dec 06 01:35:37 crc kubenswrapper[4991]: I1206 01:35:37.054931 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wkngq"] Dec 06 01:35:37 crc kubenswrapper[4991]: I1206 01:35:37.062766 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wkngq"] Dec 06 01:35:39 crc kubenswrapper[4991]: I1206 01:35:39.056843 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df43ecec-34dd-4ec1-aa1e-bdc70ff19359" path="/var/lib/kubelet/pods/df43ecec-34dd-4ec1-aa1e-bdc70ff19359/volumes" Dec 06 01:35:55 crc kubenswrapper[4991]: I1206 
01:35:55.564946 4991 generic.go:334] "Generic (PLEG): container finished" podID="69c9b4b8-de83-4f43-8670-e9279daadb9b" containerID="d3d69cc2ca60618e9d19a324a000ff21bdb3e3b3da2bd7b47bf82455d244f015" exitCode=0 Dec 06 01:35:55 crc kubenswrapper[4991]: I1206 01:35:55.565051 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" event={"ID":"69c9b4b8-de83-4f43-8670-e9279daadb9b","Type":"ContainerDied","Data":"d3d69cc2ca60618e9d19a324a000ff21bdb3e3b3da2bd7b47bf82455d244f015"} Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.179093 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.315421 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdngx\" (UniqueName: \"kubernetes.io/projected/69c9b4b8-de83-4f43-8670-e9279daadb9b-kube-api-access-rdngx\") pod \"69c9b4b8-de83-4f43-8670-e9279daadb9b\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.315600 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-inventory\") pod \"69c9b4b8-de83-4f43-8670-e9279daadb9b\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.315824 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-ssh-key\") pod \"69c9b4b8-de83-4f43-8670-e9279daadb9b\" (UID: \"69c9b4b8-de83-4f43-8670-e9279daadb9b\") " Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.322889 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/69c9b4b8-de83-4f43-8670-e9279daadb9b-kube-api-access-rdngx" (OuterVolumeSpecName: "kube-api-access-rdngx") pod "69c9b4b8-de83-4f43-8670-e9279daadb9b" (UID: "69c9b4b8-de83-4f43-8670-e9279daadb9b"). InnerVolumeSpecName "kube-api-access-rdngx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.352028 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "69c9b4b8-de83-4f43-8670-e9279daadb9b" (UID: "69c9b4b8-de83-4f43-8670-e9279daadb9b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.355811 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-inventory" (OuterVolumeSpecName: "inventory") pod "69c9b4b8-de83-4f43-8670-e9279daadb9b" (UID: "69c9b4b8-de83-4f43-8670-e9279daadb9b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.419236 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.419300 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdngx\" (UniqueName: \"kubernetes.io/projected/69c9b4b8-de83-4f43-8670-e9279daadb9b-kube-api-access-rdngx\") on node \"crc\" DevicePath \"\"" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.419318 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69c9b4b8-de83-4f43-8670-e9279daadb9b-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.609173 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" event={"ID":"69c9b4b8-de83-4f43-8670-e9279daadb9b","Type":"ContainerDied","Data":"eccdcf8e5e52934fafdf03614a6e52b075ee1e764781fdba2427c5ae2f94e3f2"} Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.609232 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eccdcf8e5e52934fafdf03614a6e52b075ee1e764781fdba2427c5ae2f94e3f2" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.609309 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.701467 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-74xt7"] Dec 06 01:35:57 crc kubenswrapper[4991]: E1206 01:35:57.702395 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c9b4b8-de83-4f43-8670-e9279daadb9b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.702580 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c9b4b8-de83-4f43-8670-e9279daadb9b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.703142 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c9b4b8-de83-4f43-8670-e9279daadb9b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.704416 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.707002 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.707213 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.711355 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-74xt7"] Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.713398 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.713467 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.829426 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fxx\" (UniqueName: \"kubernetes.io/projected/dfadc831-65e8-4c83-a3f6-1316db177fb5-kube-api-access-v5fxx\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.829484 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.829527 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.931910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fxx\" (UniqueName: \"kubernetes.io/projected/dfadc831-65e8-4c83-a3f6-1316db177fb5-kube-api-access-v5fxx\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.931973 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.932029 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.937747 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: 
I1206 01:35:57.941888 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:57 crc kubenswrapper[4991]: I1206 01:35:57.956347 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fxx\" (UniqueName: \"kubernetes.io/projected/dfadc831-65e8-4c83-a3f6-1316db177fb5-kube-api-access-v5fxx\") pod \"ssh-known-hosts-edpm-deployment-74xt7\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:58 crc kubenswrapper[4991]: I1206 01:35:58.032388 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:35:58 crc kubenswrapper[4991]: I1206 01:35:58.649537 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-74xt7"] Dec 06 01:35:59 crc kubenswrapper[4991]: I1206 01:35:59.633156 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" event={"ID":"dfadc831-65e8-4c83-a3f6-1316db177fb5","Type":"ContainerStarted","Data":"46bd7ca9c55a00f7937975f9c3d322562005059a5e171e17dfd92a0b5330f9af"} Dec 06 01:35:59 crc kubenswrapper[4991]: I1206 01:35:59.633581 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" event={"ID":"dfadc831-65e8-4c83-a3f6-1316db177fb5","Type":"ContainerStarted","Data":"fc90cf3a802ee9981f15fc78eb8bda76d95fe336b75e96b01173f52b7180b403"} Dec 06 01:35:59 crc kubenswrapper[4991]: I1206 01:35:59.656140 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" 
podStartSLOduration=2.4679790280000002 podStartE2EDuration="2.656120514s" podCreationTimestamp="2025-12-06 01:35:57 +0000 UTC" firstStartedPulling="2025-12-06 01:35:58.650599243 +0000 UTC m=+2032.000889711" lastFinishedPulling="2025-12-06 01:35:58.838740729 +0000 UTC m=+2032.189031197" observedRunningTime="2025-12-06 01:35:59.647883947 +0000 UTC m=+2032.998174505" watchObservedRunningTime="2025-12-06 01:35:59.656120514 +0000 UTC m=+2033.006410982" Dec 06 01:36:06 crc kubenswrapper[4991]: I1206 01:36:06.717555 4991 generic.go:334] "Generic (PLEG): container finished" podID="dfadc831-65e8-4c83-a3f6-1316db177fb5" containerID="46bd7ca9c55a00f7937975f9c3d322562005059a5e171e17dfd92a0b5330f9af" exitCode=0 Dec 06 01:36:06 crc kubenswrapper[4991]: I1206 01:36:06.717670 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" event={"ID":"dfadc831-65e8-4c83-a3f6-1316db177fb5","Type":"ContainerDied","Data":"46bd7ca9c55a00f7937975f9c3d322562005059a5e171e17dfd92a0b5330f9af"} Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.233211 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.270551 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fxx\" (UniqueName: \"kubernetes.io/projected/dfadc831-65e8-4c83-a3f6-1316db177fb5-kube-api-access-v5fxx\") pod \"dfadc831-65e8-4c83-a3f6-1316db177fb5\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.270750 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-ssh-key-openstack-edpm-ipam\") pod \"dfadc831-65e8-4c83-a3f6-1316db177fb5\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.271042 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-inventory-0\") pod \"dfadc831-65e8-4c83-a3f6-1316db177fb5\" (UID: \"dfadc831-65e8-4c83-a3f6-1316db177fb5\") " Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.277588 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfadc831-65e8-4c83-a3f6-1316db177fb5-kube-api-access-v5fxx" (OuterVolumeSpecName: "kube-api-access-v5fxx") pod "dfadc831-65e8-4c83-a3f6-1316db177fb5" (UID: "dfadc831-65e8-4c83-a3f6-1316db177fb5"). InnerVolumeSpecName "kube-api-access-v5fxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.314568 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dfadc831-65e8-4c83-a3f6-1316db177fb5" (UID: "dfadc831-65e8-4c83-a3f6-1316db177fb5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.315443 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "dfadc831-65e8-4c83-a3f6-1316db177fb5" (UID: "dfadc831-65e8-4c83-a3f6-1316db177fb5"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.374248 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5fxx\" (UniqueName: \"kubernetes.io/projected/dfadc831-65e8-4c83-a3f6-1316db177fb5-kube-api-access-v5fxx\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.374498 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.374559 4991 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dfadc831-65e8-4c83-a3f6-1316db177fb5-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.745271 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" 
event={"ID":"dfadc831-65e8-4c83-a3f6-1316db177fb5","Type":"ContainerDied","Data":"fc90cf3a802ee9981f15fc78eb8bda76d95fe336b75e96b01173f52b7180b403"} Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.745334 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc90cf3a802ee9981f15fc78eb8bda76d95fe336b75e96b01173f52b7180b403" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.745387 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-74xt7" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.849537 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr"] Dec 06 01:36:08 crc kubenswrapper[4991]: E1206 01:36:08.850219 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfadc831-65e8-4c83-a3f6-1316db177fb5" containerName="ssh-known-hosts-edpm-deployment" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.850240 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfadc831-65e8-4c83-a3f6-1316db177fb5" containerName="ssh-known-hosts-edpm-deployment" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.850566 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfadc831-65e8-4c83-a3f6-1316db177fb5" containerName="ssh-known-hosts-edpm-deployment" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.851689 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.854701 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.858495 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.858822 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.860203 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.866153 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr"] Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.886234 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.886305 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.886503 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcv2\" (UniqueName: \"kubernetes.io/projected/1c339885-2776-49cc-88b6-e79ff184d76a-kube-api-access-frcv2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.989665 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcv2\" (UniqueName: \"kubernetes.io/projected/1c339885-2776-49cc-88b6-e79ff184d76a-kube-api-access-frcv2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.989897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.989940 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.996366 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:08 crc kubenswrapper[4991]: I1206 01:36:08.997324 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:09 crc kubenswrapper[4991]: I1206 01:36:09.014141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcv2\" (UniqueName: \"kubernetes.io/projected/1c339885-2776-49cc-88b6-e79ff184d76a-kube-api-access-frcv2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lwgtr\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:09 crc kubenswrapper[4991]: I1206 01:36:09.185233 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:09 crc kubenswrapper[4991]: I1206 01:36:09.770917 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr"] Dec 06 01:36:10 crc kubenswrapper[4991]: I1206 01:36:10.765886 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" event={"ID":"1c339885-2776-49cc-88b6-e79ff184d76a","Type":"ContainerStarted","Data":"52dae9bbf7c7ea1c6ab4a498051ce793e1a3c9a61a0f412a6bd21ebcbec5d50f"} Dec 06 01:36:10 crc kubenswrapper[4991]: I1206 01:36:10.766468 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" event={"ID":"1c339885-2776-49cc-88b6-e79ff184d76a","Type":"ContainerStarted","Data":"9cda40385a4acaf12d0cfd857f25f68a467f38a5a9675769c38cccf94a0081cf"} Dec 06 01:36:10 crc kubenswrapper[4991]: I1206 01:36:10.789012 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" podStartSLOduration=2.58127762 podStartE2EDuration="2.78847456s" podCreationTimestamp="2025-12-06 01:36:08 +0000 UTC" firstStartedPulling="2025-12-06 01:36:09.779944588 +0000 UTC m=+2043.130235056" lastFinishedPulling="2025-12-06 01:36:09.987141528 +0000 UTC m=+2043.337431996" observedRunningTime="2025-12-06 01:36:10.783402366 +0000 UTC m=+2044.133692844" watchObservedRunningTime="2025-12-06 01:36:10.78847456 +0000 UTC m=+2044.138765028" Dec 06 01:36:17 crc kubenswrapper[4991]: I1206 01:36:17.585626 4991 scope.go:117] "RemoveContainer" containerID="94f8e0d6570e7f3ee8ea306d927da00a57a898b9302ebe496fc79e910650da3c" Dec 06 01:36:18 crc kubenswrapper[4991]: I1206 01:36:18.872554 4991 generic.go:334] "Generic (PLEG): container finished" podID="1c339885-2776-49cc-88b6-e79ff184d76a" 
containerID="52dae9bbf7c7ea1c6ab4a498051ce793e1a3c9a61a0f412a6bd21ebcbec5d50f" exitCode=0 Dec 06 01:36:18 crc kubenswrapper[4991]: I1206 01:36:18.872581 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" event={"ID":"1c339885-2776-49cc-88b6-e79ff184d76a","Type":"ContainerDied","Data":"52dae9bbf7c7ea1c6ab4a498051ce793e1a3c9a61a0f412a6bd21ebcbec5d50f"} Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.301046 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.448493 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-inventory\") pod \"1c339885-2776-49cc-88b6-e79ff184d76a\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.449260 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcv2\" (UniqueName: \"kubernetes.io/projected/1c339885-2776-49cc-88b6-e79ff184d76a-kube-api-access-frcv2\") pod \"1c339885-2776-49cc-88b6-e79ff184d76a\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.449371 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-ssh-key\") pod \"1c339885-2776-49cc-88b6-e79ff184d76a\" (UID: \"1c339885-2776-49cc-88b6-e79ff184d76a\") " Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.457272 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c339885-2776-49cc-88b6-e79ff184d76a-kube-api-access-frcv2" (OuterVolumeSpecName: "kube-api-access-frcv2") pod "1c339885-2776-49cc-88b6-e79ff184d76a" (UID: 
"1c339885-2776-49cc-88b6-e79ff184d76a"). InnerVolumeSpecName "kube-api-access-frcv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.482708 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c339885-2776-49cc-88b6-e79ff184d76a" (UID: "1c339885-2776-49cc-88b6-e79ff184d76a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.484360 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-inventory" (OuterVolumeSpecName: "inventory") pod "1c339885-2776-49cc-88b6-e79ff184d76a" (UID: "1c339885-2776-49cc-88b6-e79ff184d76a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.553330 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.553795 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcv2\" (UniqueName: \"kubernetes.io/projected/1c339885-2776-49cc-88b6-e79ff184d76a-kube-api-access-frcv2\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.553980 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c339885-2776-49cc-88b6-e79ff184d76a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.900339 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" 
event={"ID":"1c339885-2776-49cc-88b6-e79ff184d76a","Type":"ContainerDied","Data":"9cda40385a4acaf12d0cfd857f25f68a467f38a5a9675769c38cccf94a0081cf"} Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.900640 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cda40385a4acaf12d0cfd857f25f68a467f38a5a9675769c38cccf94a0081cf" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.900591 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lwgtr" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.987091 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g"] Dec 06 01:36:20 crc kubenswrapper[4991]: E1206 01:36:20.987818 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c339885-2776-49cc-88b6-e79ff184d76a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.987857 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c339885-2776-49cc-88b6-e79ff184d76a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.988214 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c339885-2776-49cc-88b6-e79ff184d76a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.989425 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.991719 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.992037 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.992141 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:36:20 crc kubenswrapper[4991]: I1206 01:36:20.992246 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.017394 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g"] Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.082757 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.082857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4kx\" (UniqueName: \"kubernetes.io/projected/8695023e-4250-4f03-960e-873fd9f9fba5-kube-api-access-bn4kx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.083246 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.186291 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.186405 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4kx\" (UniqueName: \"kubernetes.io/projected/8695023e-4250-4f03-960e-873fd9f9fba5-kube-api-access-bn4kx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.186660 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.191663 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: 
\"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.191960 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.207817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4kx\" (UniqueName: \"kubernetes.io/projected/8695023e-4250-4f03-960e-873fd9f9fba5-kube-api-access-bn4kx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.315449 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.888153 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g"] Dec 06 01:36:21 crc kubenswrapper[4991]: I1206 01:36:21.913622 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" event={"ID":"8695023e-4250-4f03-960e-873fd9f9fba5","Type":"ContainerStarted","Data":"7c6cda20bdf148e2be4ec806b7ca8c26af40c2ccd01e938baa2867cf1223221e"} Dec 06 01:36:22 crc kubenswrapper[4991]: I1206 01:36:22.930221 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" event={"ID":"8695023e-4250-4f03-960e-873fd9f9fba5","Type":"ContainerStarted","Data":"9f6dbf3d41273fde41ade3e7f7d809dbe702e76671c7a3c859afff6099fa90b3"} Dec 06 01:36:22 crc kubenswrapper[4991]: I1206 01:36:22.966835 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" podStartSLOduration=2.780148107 podStartE2EDuration="2.966812825s" podCreationTimestamp="2025-12-06 01:36:20 +0000 UTC" firstStartedPulling="2025-12-06 01:36:21.893361609 +0000 UTC m=+2055.243652087" lastFinishedPulling="2025-12-06 01:36:22.080026327 +0000 UTC m=+2055.430316805" observedRunningTime="2025-12-06 01:36:22.952853006 +0000 UTC m=+2056.303143484" watchObservedRunningTime="2025-12-06 01:36:22.966812825 +0000 UTC m=+2056.317103303" Dec 06 01:36:32 crc kubenswrapper[4991]: I1206 01:36:32.044432 4991 generic.go:334] "Generic (PLEG): container finished" podID="8695023e-4250-4f03-960e-873fd9f9fba5" containerID="9f6dbf3d41273fde41ade3e7f7d809dbe702e76671c7a3c859afff6099fa90b3" exitCode=0 Dec 06 01:36:32 crc kubenswrapper[4991]: I1206 01:36:32.044585 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" event={"ID":"8695023e-4250-4f03-960e-873fd9f9fba5","Type":"ContainerDied","Data":"9f6dbf3d41273fde41ade3e7f7d809dbe702e76671c7a3c859afff6099fa90b3"} Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.501385 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.567752 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn4kx\" (UniqueName: \"kubernetes.io/projected/8695023e-4250-4f03-960e-873fd9f9fba5-kube-api-access-bn4kx\") pod \"8695023e-4250-4f03-960e-873fd9f9fba5\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.567934 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-ssh-key\") pod \"8695023e-4250-4f03-960e-873fd9f9fba5\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.568352 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-inventory\") pod \"8695023e-4250-4f03-960e-873fd9f9fba5\" (UID: \"8695023e-4250-4f03-960e-873fd9f9fba5\") " Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.575071 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8695023e-4250-4f03-960e-873fd9f9fba5-kube-api-access-bn4kx" (OuterVolumeSpecName: "kube-api-access-bn4kx") pod "8695023e-4250-4f03-960e-873fd9f9fba5" (UID: "8695023e-4250-4f03-960e-873fd9f9fba5"). InnerVolumeSpecName "kube-api-access-bn4kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.596910 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8695023e-4250-4f03-960e-873fd9f9fba5" (UID: "8695023e-4250-4f03-960e-873fd9f9fba5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.607743 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-inventory" (OuterVolumeSpecName: "inventory") pod "8695023e-4250-4f03-960e-873fd9f9fba5" (UID: "8695023e-4250-4f03-960e-873fd9f9fba5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.671423 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.671465 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn4kx\" (UniqueName: \"kubernetes.io/projected/8695023e-4250-4f03-960e-873fd9f9fba5-kube-api-access-bn4kx\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:33 crc kubenswrapper[4991]: I1206 01:36:33.671696 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8695023e-4250-4f03-960e-873fd9f9fba5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.068337 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" 
event={"ID":"8695023e-4250-4f03-960e-873fd9f9fba5","Type":"ContainerDied","Data":"7c6cda20bdf148e2be4ec806b7ca8c26af40c2ccd01e938baa2867cf1223221e"} Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.068376 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.068396 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6cda20bdf148e2be4ec806b7ca8c26af40c2ccd01e938baa2867cf1223221e" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.183147 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9"] Dec 06 01:36:34 crc kubenswrapper[4991]: E1206 01:36:34.183742 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8695023e-4250-4f03-960e-873fd9f9fba5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.183771 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8695023e-4250-4f03-960e-873fd9f9fba5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.184078 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8695023e-4250-4f03-960e-873fd9f9fba5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.185071 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.188026 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.188285 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.189207 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.189602 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.190057 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.190473 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.191167 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.193511 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.209461 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9"] Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282298 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282319 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr8b7\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-kube-api-access-wr8b7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282653 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282774 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282856 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.282925 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283001 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283049 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283112 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283166 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283199 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283218 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.283260 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.386779 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.386842 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.386875 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr8b7\" (UniqueName: 
\"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-kube-api-access-wr8b7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387024 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387088 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387135 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387182 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387226 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387257 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387315 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387400 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: 
\"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387462 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387502 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.387532 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.398492 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.399199 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.401673 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.402338 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.403774 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.404366 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.404599 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.405268 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.405928 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.406949 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.409089 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.410188 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.422045 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.434125 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr8b7\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-kube-api-access-wr8b7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hls9\" (UID: 
\"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:34 crc kubenswrapper[4991]: I1206 01:36:34.516690 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:36:35 crc kubenswrapper[4991]: I1206 01:36:35.175238 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9"] Dec 06 01:36:36 crc kubenswrapper[4991]: I1206 01:36:36.090922 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" event={"ID":"a64fe8ac-b9eb-4c7e-917b-56fb0da51872","Type":"ContainerStarted","Data":"24455171d73d74cc5e9cf11a88bba2a101b1d8372b0657e51c7d9803dca18162"} Dec 06 01:36:36 crc kubenswrapper[4991]: I1206 01:36:36.090978 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" event={"ID":"a64fe8ac-b9eb-4c7e-917b-56fb0da51872","Type":"ContainerStarted","Data":"cfc564d56cdd7fcb4113048537b2282eadf1f19b6cd59594cc747a83b178dc26"} Dec 06 01:36:36 crc kubenswrapper[4991]: I1206 01:36:36.128893 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" podStartSLOduration=1.965044769 podStartE2EDuration="2.128870274s" podCreationTimestamp="2025-12-06 01:36:34 +0000 UTC" firstStartedPulling="2025-12-06 01:36:35.168941555 +0000 UTC m=+2068.519232063" lastFinishedPulling="2025-12-06 01:36:35.33276707 +0000 UTC m=+2068.683057568" observedRunningTime="2025-12-06 01:36:36.123636666 +0000 UTC m=+2069.473927134" watchObservedRunningTime="2025-12-06 01:36:36.128870274 +0000 UTC m=+2069.479160752" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.299930 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-zpp6g"] Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.303730 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.344415 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpp6g"] Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.445466 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtxd\" (UniqueName: \"kubernetes.io/projected/05904923-5e00-4c85-bfbe-e7699eaa6f44-kube-api-access-7mtxd\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.445590 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-utilities\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.445774 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-catalog-content\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.548284 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-catalog-content\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " 
pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.548453 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtxd\" (UniqueName: \"kubernetes.io/projected/05904923-5e00-4c85-bfbe-e7699eaa6f44-kube-api-access-7mtxd\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.548516 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-utilities\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.548961 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-catalog-content\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.549391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-utilities\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 crc kubenswrapper[4991]: I1206 01:36:57.574302 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtxd\" (UniqueName: \"kubernetes.io/projected/05904923-5e00-4c85-bfbe-e7699eaa6f44-kube-api-access-7mtxd\") pod \"redhat-operators-zpp6g\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:57 
crc kubenswrapper[4991]: I1206 01:36:57.634096 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:36:58 crc kubenswrapper[4991]: I1206 01:36:58.157497 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpp6g"] Dec 06 01:36:58 crc kubenswrapper[4991]: W1206 01:36:58.161867 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05904923_5e00_4c85_bfbe_e7699eaa6f44.slice/crio-a7bb0fa971fb02bff7dea16bcf8d614050a6a412b91a28c2c624214cb41af79d WatchSource:0}: Error finding container a7bb0fa971fb02bff7dea16bcf8d614050a6a412b91a28c2c624214cb41af79d: Status 404 returned error can't find the container with id a7bb0fa971fb02bff7dea16bcf8d614050a6a412b91a28c2c624214cb41af79d Dec 06 01:36:58 crc kubenswrapper[4991]: I1206 01:36:58.345809 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerStarted","Data":"a7bb0fa971fb02bff7dea16bcf8d614050a6a412b91a28c2c624214cb41af79d"} Dec 06 01:36:59 crc kubenswrapper[4991]: I1206 01:36:59.358381 4991 generic.go:334] "Generic (PLEG): container finished" podID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerID="0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad" exitCode=0 Dec 06 01:36:59 crc kubenswrapper[4991]: I1206 01:36:59.358450 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerDied","Data":"0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad"} Dec 06 01:37:00 crc kubenswrapper[4991]: I1206 01:37:00.369567 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" 
event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerStarted","Data":"e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9"} Dec 06 01:37:01 crc kubenswrapper[4991]: I1206 01:37:01.383286 4991 generic.go:334] "Generic (PLEG): container finished" podID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerID="e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9" exitCode=0 Dec 06 01:37:01 crc kubenswrapper[4991]: I1206 01:37:01.383381 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerDied","Data":"e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9"} Dec 06 01:37:02 crc kubenswrapper[4991]: I1206 01:37:02.400151 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerStarted","Data":"877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a"} Dec 06 01:37:02 crc kubenswrapper[4991]: I1206 01:37:02.437476 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zpp6g" podStartSLOduration=2.83954255 podStartE2EDuration="5.437456925s" podCreationTimestamp="2025-12-06 01:36:57 +0000 UTC" firstStartedPulling="2025-12-06 01:36:59.362317663 +0000 UTC m=+2092.712608141" lastFinishedPulling="2025-12-06 01:37:01.960232028 +0000 UTC m=+2095.310522516" observedRunningTime="2025-12-06 01:37:02.430602155 +0000 UTC m=+2095.780892623" watchObservedRunningTime="2025-12-06 01:37:02.437456925 +0000 UTC m=+2095.787747393" Dec 06 01:37:07 crc kubenswrapper[4991]: I1206 01:37:07.636316 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:37:07 crc kubenswrapper[4991]: I1206 01:37:07.637363 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:37:08 crc kubenswrapper[4991]: I1206 01:37:08.694220 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zpp6g" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="registry-server" probeResult="failure" output=< Dec 06 01:37:08 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:37:08 crc kubenswrapper[4991]: > Dec 06 01:37:15 crc kubenswrapper[4991]: I1206 01:37:15.541129 4991 generic.go:334] "Generic (PLEG): container finished" podID="a64fe8ac-b9eb-4c7e-917b-56fb0da51872" containerID="24455171d73d74cc5e9cf11a88bba2a101b1d8372b0657e51c7d9803dca18162" exitCode=0 Dec 06 01:37:15 crc kubenswrapper[4991]: I1206 01:37:15.542041 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" event={"ID":"a64fe8ac-b9eb-4c7e-917b-56fb0da51872","Type":"ContainerDied","Data":"24455171d73d74cc5e9cf11a88bba2a101b1d8372b0657e51c7d9803dca18162"} Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.170933 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.316932 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317054 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-bootstrap-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317130 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317205 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-nova-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317228 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-inventory\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: 
\"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317266 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-telemetry-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317300 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-neutron-metadata-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317350 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-repo-setup-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317402 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317440 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ovn-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 
06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317477 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr8b7\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-kube-api-access-wr8b7\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317551 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ssh-key\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317590 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-ovn-default-certs-0\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.317622 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-libvirt-combined-ca-bundle\") pod \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\" (UID: \"a64fe8ac-b9eb-4c7e-917b-56fb0da51872\") " Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.328664 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-kube-api-access-wr8b7" (OuterVolumeSpecName: "kube-api-access-wr8b7") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "kube-api-access-wr8b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.330201 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.330387 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.330773 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.334351 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.334324 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.339252 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.339299 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.339275 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.339308 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.339514 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.339905 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.365186 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-inventory" (OuterVolumeSpecName: "inventory") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.366548 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a64fe8ac-b9eb-4c7e-917b-56fb0da51872" (UID: "a64fe8ac-b9eb-4c7e-917b-56fb0da51872"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420160 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420207 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420229 4991 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420246 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420264 4991 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 
01:37:17.420278 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420289 4991 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420301 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420315 4991 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420330 4991 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420344 4991 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420362 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420375 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.420391 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr8b7\" (UniqueName: \"kubernetes.io/projected/a64fe8ac-b9eb-4c7e-917b-56fb0da51872-kube-api-access-wr8b7\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.568194 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" event={"ID":"a64fe8ac-b9eb-4c7e-917b-56fb0da51872","Type":"ContainerDied","Data":"cfc564d56cdd7fcb4113048537b2282eadf1f19b6cd59594cc747a83b178dc26"} Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.568259 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfc564d56cdd7fcb4113048537b2282eadf1f19b6cd59594cc747a83b178dc26" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.568348 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hls9" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.693070 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6"] Dec 06 01:37:17 crc kubenswrapper[4991]: E1206 01:37:17.693844 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64fe8ac-b9eb-4c7e-917b-56fb0da51872" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.693862 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64fe8ac-b9eb-4c7e-917b-56fb0da51872" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.694101 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64fe8ac-b9eb-4c7e-917b-56fb0da51872" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.694759 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.701965 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.702034 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.702217 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.702408 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.702660 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.715849 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6"] Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.745738 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.815736 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.832406 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc 
kubenswrapper[4991]: I1206 01:37:17.832530 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.855588 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.855634 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.855659 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbmj\" (UniqueName: \"kubernetes.io/projected/8ae62c44-69d1-4cbf-abb9-491082bf83e3-kube-api-access-kxbmj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: E1206 01:37:17.895888 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64fe8ac_b9eb_4c7e_917b_56fb0da51872.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64fe8ac_b9eb_4c7e_917b_56fb0da51872.slice/crio-cfc564d56cdd7fcb4113048537b2282eadf1f19b6cd59594cc747a83b178dc26\": RecentStats: unable to find data in memory cache]" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.957475 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.957520 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.957542 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbmj\" (UniqueName: \"kubernetes.io/projected/8ae62c44-69d1-4cbf-abb9-491082bf83e3-kube-api-access-kxbmj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.957608 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: 
\"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.957663 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.959824 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.963768 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.963948 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.969014 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:17 crc kubenswrapper[4991]: I1206 01:37:17.986716 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbmj\" (UniqueName: \"kubernetes.io/projected/8ae62c44-69d1-4cbf-abb9-491082bf83e3-kube-api-access-kxbmj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fcgx6\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:18 crc kubenswrapper[4991]: I1206 01:37:18.003836 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpp6g"] Dec 06 01:37:18 crc kubenswrapper[4991]: I1206 01:37:18.096706 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:37:18 crc kubenswrapper[4991]: I1206 01:37:18.680835 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6"] Dec 06 01:37:18 crc kubenswrapper[4991]: W1206 01:37:18.687322 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae62c44_69d1_4cbf_abb9_491082bf83e3.slice/crio-36c0c74585aa16b769a079c7ef49d60567b2f86e961664e4c9cb425eb37af5a3 WatchSource:0}: Error finding container 36c0c74585aa16b769a079c7ef49d60567b2f86e961664e4c9cb425eb37af5a3: Status 404 returned error can't find the container with id 36c0c74585aa16b769a079c7ef49d60567b2f86e961664e4c9cb425eb37af5a3 Dec 06 01:37:19 crc kubenswrapper[4991]: I1206 01:37:19.592460 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" 
event={"ID":"8ae62c44-69d1-4cbf-abb9-491082bf83e3","Type":"ContainerStarted","Data":"f9bb76df1ecbeff2386e9e9a9c794adb62bd3192f75afe16b39977e1ecd39689"} Dec 06 01:37:19 crc kubenswrapper[4991]: I1206 01:37:19.592928 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" event={"ID":"8ae62c44-69d1-4cbf-abb9-491082bf83e3","Type":"ContainerStarted","Data":"36c0c74585aa16b769a079c7ef49d60567b2f86e961664e4c9cb425eb37af5a3"} Dec 06 01:37:19 crc kubenswrapper[4991]: I1206 01:37:19.593031 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zpp6g" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="registry-server" containerID="cri-o://877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a" gracePeriod=2 Dec 06 01:37:19 crc kubenswrapper[4991]: I1206 01:37:19.623600 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" podStartSLOduration=2.447441947 podStartE2EDuration="2.623575957s" podCreationTimestamp="2025-12-06 01:37:17 +0000 UTC" firstStartedPulling="2025-12-06 01:37:18.690588109 +0000 UTC m=+2112.040878587" lastFinishedPulling="2025-12-06 01:37:18.866722129 +0000 UTC m=+2112.217012597" observedRunningTime="2025-12-06 01:37:19.612821083 +0000 UTC m=+2112.963111551" watchObservedRunningTime="2025-12-06 01:37:19.623575957 +0000 UTC m=+2112.973866435" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.182560 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.317835 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-catalog-content\") pod \"05904923-5e00-4c85-bfbe-e7699eaa6f44\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.318226 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-utilities\") pod \"05904923-5e00-4c85-bfbe-e7699eaa6f44\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.318525 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtxd\" (UniqueName: \"kubernetes.io/projected/05904923-5e00-4c85-bfbe-e7699eaa6f44-kube-api-access-7mtxd\") pod \"05904923-5e00-4c85-bfbe-e7699eaa6f44\" (UID: \"05904923-5e00-4c85-bfbe-e7699eaa6f44\") " Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.318846 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-utilities" (OuterVolumeSpecName: "utilities") pod "05904923-5e00-4c85-bfbe-e7699eaa6f44" (UID: "05904923-5e00-4c85-bfbe-e7699eaa6f44"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.319209 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.324888 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05904923-5e00-4c85-bfbe-e7699eaa6f44-kube-api-access-7mtxd" (OuterVolumeSpecName: "kube-api-access-7mtxd") pod "05904923-5e00-4c85-bfbe-e7699eaa6f44" (UID: "05904923-5e00-4c85-bfbe-e7699eaa6f44"). InnerVolumeSpecName "kube-api-access-7mtxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.417186 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05904923-5e00-4c85-bfbe-e7699eaa6f44" (UID: "05904923-5e00-4c85-bfbe-e7699eaa6f44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.422123 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mtxd\" (UniqueName: \"kubernetes.io/projected/05904923-5e00-4c85-bfbe-e7699eaa6f44-kube-api-access-7mtxd\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.422277 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05904923-5e00-4c85-bfbe-e7699eaa6f44-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.604635 4991 generic.go:334] "Generic (PLEG): container finished" podID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerID="877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a" exitCode=0 Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.604739 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpp6g" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.604935 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerDied","Data":"877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a"} Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.605073 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpp6g" event={"ID":"05904923-5e00-4c85-bfbe-e7699eaa6f44","Type":"ContainerDied","Data":"a7bb0fa971fb02bff7dea16bcf8d614050a6a412b91a28c2c624214cb41af79d"} Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.605156 4991 scope.go:117] "RemoveContainer" containerID="877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.634450 4991 scope.go:117] "RemoveContainer" 
containerID="e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.639829 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpp6g"] Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.648252 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zpp6g"] Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.673221 4991 scope.go:117] "RemoveContainer" containerID="0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.712431 4991 scope.go:117] "RemoveContainer" containerID="877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a" Dec 06 01:37:20 crc kubenswrapper[4991]: E1206 01:37:20.713055 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a\": container with ID starting with 877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a not found: ID does not exist" containerID="877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.713093 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a"} err="failed to get container status \"877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a\": rpc error: code = NotFound desc = could not find container \"877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a\": container with ID starting with 877363a28238b9dc4e07185b81dd33615d370ac6fe32cc910980dd5359b5738a not found: ID does not exist" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.713121 4991 scope.go:117] "RemoveContainer" 
containerID="e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9" Dec 06 01:37:20 crc kubenswrapper[4991]: E1206 01:37:20.713673 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9\": container with ID starting with e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9 not found: ID does not exist" containerID="e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.713704 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9"} err="failed to get container status \"e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9\": rpc error: code = NotFound desc = could not find container \"e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9\": container with ID starting with e03641d5d9749e9c8a5a105b4a46e94fb206f59ba48467d26b1fa6068a8fcfb9 not found: ID does not exist" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.713726 4991 scope.go:117] "RemoveContainer" containerID="0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad" Dec 06 01:37:20 crc kubenswrapper[4991]: E1206 01:37:20.714034 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad\": container with ID starting with 0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad not found: ID does not exist" containerID="0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad" Dec 06 01:37:20 crc kubenswrapper[4991]: I1206 01:37:20.714062 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad"} err="failed to get container status \"0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad\": rpc error: code = NotFound desc = could not find container \"0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad\": container with ID starting with 0dc9fd200a513d48f0320e8b69bffeee8d0b3b3ee2239b07dde94543d705b1ad not found: ID does not exist" Dec 06 01:37:21 crc kubenswrapper[4991]: I1206 01:37:21.050388 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" path="/var/lib/kubelet/pods/05904923-5e00-4c85-bfbe-e7699eaa6f44/volumes" Dec 06 01:37:46 crc kubenswrapper[4991]: I1206 01:37:46.744367 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:37:46 crc kubenswrapper[4991]: I1206 01:37:46.744957 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:38:16 crc kubenswrapper[4991]: I1206 01:38:16.739924 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:38:16 crc kubenswrapper[4991]: I1206 01:38:16.740814 4991 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:38:27 crc kubenswrapper[4991]: I1206 01:38:27.418725 4991 generic.go:334] "Generic (PLEG): container finished" podID="8ae62c44-69d1-4cbf-abb9-491082bf83e3" containerID="f9bb76df1ecbeff2386e9e9a9c794adb62bd3192f75afe16b39977e1ecd39689" exitCode=0 Dec 06 01:38:27 crc kubenswrapper[4991]: I1206 01:38:27.418845 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" event={"ID":"8ae62c44-69d1-4cbf-abb9-491082bf83e3","Type":"ContainerDied","Data":"f9bb76df1ecbeff2386e9e9a9c794adb62bd3192f75afe16b39977e1ecd39689"} Dec 06 01:38:28 crc kubenswrapper[4991]: I1206 01:38:28.918149 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:38:28 crc kubenswrapper[4991]: I1206 01:38:28.997438 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovncontroller-config-0\") pod \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " Dec 06 01:38:28 crc kubenswrapper[4991]: I1206 01:38:28.997657 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-inventory\") pod \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " Dec 06 01:38:28 crc kubenswrapper[4991]: I1206 01:38:28.997725 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxbmj\" (UniqueName: 
\"kubernetes.io/projected/8ae62c44-69d1-4cbf-abb9-491082bf83e3-kube-api-access-kxbmj\") pod \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " Dec 06 01:38:28 crc kubenswrapper[4991]: I1206 01:38:28.997886 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ssh-key\") pod \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " Dec 06 01:38:28 crc kubenswrapper[4991]: I1206 01:38:28.998183 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovn-combined-ca-bundle\") pod \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\" (UID: \"8ae62c44-69d1-4cbf-abb9-491082bf83e3\") " Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.009121 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8ae62c44-69d1-4cbf-abb9-491082bf83e3" (UID: "8ae62c44-69d1-4cbf-abb9-491082bf83e3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.010169 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae62c44-69d1-4cbf-abb9-491082bf83e3-kube-api-access-kxbmj" (OuterVolumeSpecName: "kube-api-access-kxbmj") pod "8ae62c44-69d1-4cbf-abb9-491082bf83e3" (UID: "8ae62c44-69d1-4cbf-abb9-491082bf83e3"). InnerVolumeSpecName "kube-api-access-kxbmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.043931 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8ae62c44-69d1-4cbf-abb9-491082bf83e3" (UID: "8ae62c44-69d1-4cbf-abb9-491082bf83e3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.046606 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-inventory" (OuterVolumeSpecName: "inventory") pod "8ae62c44-69d1-4cbf-abb9-491082bf83e3" (UID: "8ae62c44-69d1-4cbf-abb9-491082bf83e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.047513 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8ae62c44-69d1-4cbf-abb9-491082bf83e3" (UID: "8ae62c44-69d1-4cbf-abb9-491082bf83e3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.103809 4991 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.104183 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.104199 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxbmj\" (UniqueName: \"kubernetes.io/projected/8ae62c44-69d1-4cbf-abb9-491082bf83e3-kube-api-access-kxbmj\") on node \"crc\" DevicePath \"\"" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.104207 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.104218 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae62c44-69d1-4cbf-abb9-491082bf83e3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.453969 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" event={"ID":"8ae62c44-69d1-4cbf-abb9-491082bf83e3","Type":"ContainerDied","Data":"36c0c74585aa16b769a079c7ef49d60567b2f86e961664e4c9cb425eb37af5a3"} Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.454035 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36c0c74585aa16b769a079c7ef49d60567b2f86e961664e4c9cb425eb37af5a3" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 
01:38:29.454105 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fcgx6" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.638684 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq"] Dec 06 01:38:29 crc kubenswrapper[4991]: E1206 01:38:29.639122 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae62c44-69d1-4cbf-abb9-491082bf83e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.639140 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae62c44-69d1-4cbf-abb9-491082bf83e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 01:38:29 crc kubenswrapper[4991]: E1206 01:38:29.639158 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="registry-server" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.639164 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="registry-server" Dec 06 01:38:29 crc kubenswrapper[4991]: E1206 01:38:29.639180 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="extract-content" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.639188 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="extract-content" Dec 06 01:38:29 crc kubenswrapper[4991]: E1206 01:38:29.639200 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="extract-utilities" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.639208 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="extract-utilities" Dec 06 
01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.639409 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae62c44-69d1-4cbf-abb9-491082bf83e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.639435 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="05904923-5e00-4c85-bfbe-e7699eaa6f44" containerName="registry-server" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.640190 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.645135 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.645184 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.645355 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.645458 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.645458 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.645493 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.664615 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq"] Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.717739 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjrhm\" (UniqueName: \"kubernetes.io/projected/46509879-7d53-4e4c-b581-9c48e1b350ad-kube-api-access-xjrhm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.717894 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.717945 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.718007 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.718038 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.718064 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.821197 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.821289 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.821337 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.821372 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.821400 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.821431 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjrhm\" (UniqueName: \"kubernetes.io/projected/46509879-7d53-4e4c-b581-9c48e1b350ad-kube-api-access-xjrhm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.827498 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.827550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.828293 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.828301 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.830568 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 
01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.856692 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjrhm\" (UniqueName: \"kubernetes.io/projected/46509879-7d53-4e4c-b581-9c48e1b350ad-kube-api-access-xjrhm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:29 crc kubenswrapper[4991]: I1206 01:38:29.959465 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:38:30 crc kubenswrapper[4991]: I1206 01:38:30.496681 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq"] Dec 06 01:38:30 crc kubenswrapper[4991]: I1206 01:38:30.498809 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:38:31 crc kubenswrapper[4991]: I1206 01:38:31.480456 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" event={"ID":"46509879-7d53-4e4c-b581-9c48e1b350ad","Type":"ContainerStarted","Data":"4f6d0025bab6ccb1e1f7427dcd0ad9d4db4969823dfc338d3327d3d8c22eb85c"} Dec 06 01:38:31 crc kubenswrapper[4991]: I1206 01:38:31.480796 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" event={"ID":"46509879-7d53-4e4c-b581-9c48e1b350ad","Type":"ContainerStarted","Data":"4fe18c617dd667518001dfd519d186c458e89a653244e0b150f82086202553ca"} Dec 06 01:38:31 crc kubenswrapper[4991]: I1206 01:38:31.507063 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" podStartSLOduration=2.322246346 podStartE2EDuration="2.507023839s" 
podCreationTimestamp="2025-12-06 01:38:29 +0000 UTC" firstStartedPulling="2025-12-06 01:38:30.498577701 +0000 UTC m=+2183.848868169" lastFinishedPulling="2025-12-06 01:38:30.683355154 +0000 UTC m=+2184.033645662" observedRunningTime="2025-12-06 01:38:31.499823168 +0000 UTC m=+2184.850113716" watchObservedRunningTime="2025-12-06 01:38:31.507023839 +0000 UTC m=+2184.857314307" Dec 06 01:38:46 crc kubenswrapper[4991]: I1206 01:38:46.739762 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:38:46 crc kubenswrapper[4991]: I1206 01:38:46.742362 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:38:46 crc kubenswrapper[4991]: I1206 01:38:46.742488 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:38:46 crc kubenswrapper[4991]: I1206 01:38:46.743837 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a03b675513fc3791b58894fb873d91c11d8a1aef5118ca63efd76f9359bb700d"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:38:46 crc kubenswrapper[4991]: I1206 01:38:46.743950 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://a03b675513fc3791b58894fb873d91c11d8a1aef5118ca63efd76f9359bb700d" gracePeriod=600 Dec 06 01:38:47 crc kubenswrapper[4991]: I1206 01:38:47.695711 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="a03b675513fc3791b58894fb873d91c11d8a1aef5118ca63efd76f9359bb700d" exitCode=0 Dec 06 01:38:47 crc kubenswrapper[4991]: I1206 01:38:47.695774 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"a03b675513fc3791b58894fb873d91c11d8a1aef5118ca63efd76f9359bb700d"} Dec 06 01:38:47 crc kubenswrapper[4991]: I1206 01:38:47.696105 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8"} Dec 06 01:38:47 crc kubenswrapper[4991]: I1206 01:38:47.696131 4991 scope.go:117] "RemoveContainer" containerID="1191fa5f428c9ddfe26ebd749363cce958056a5e267b2384684767a98228d9bb" Dec 06 01:39:22 crc kubenswrapper[4991]: E1206 01:39:22.292881 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46509879_7d53_4e4c_b581_9c48e1b350ad.slice/crio-conmon-4f6d0025bab6ccb1e1f7427dcd0ad9d4db4969823dfc338d3327d3d8c22eb85c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46509879_7d53_4e4c_b581_9c48e1b350ad.slice/crio-4f6d0025bab6ccb1e1f7427dcd0ad9d4db4969823dfc338d3327d3d8c22eb85c.scope\": RecentStats: unable to find data in memory cache]" Dec 06 01:39:22 crc kubenswrapper[4991]: 
I1206 01:39:22.449434 4991 generic.go:334] "Generic (PLEG): container finished" podID="46509879-7d53-4e4c-b581-9c48e1b350ad" containerID="4f6d0025bab6ccb1e1f7427dcd0ad9d4db4969823dfc338d3327d3d8c22eb85c" exitCode=0 Dec 06 01:39:22 crc kubenswrapper[4991]: I1206 01:39:22.449596 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" event={"ID":"46509879-7d53-4e4c-b581-9c48e1b350ad","Type":"ContainerDied","Data":"4f6d0025bab6ccb1e1f7427dcd0ad9d4db4969823dfc338d3327d3d8c22eb85c"} Dec 06 01:39:23 crc kubenswrapper[4991]: I1206 01:39:23.990387 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.166546 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-metadata-combined-ca-bundle\") pod \"46509879-7d53-4e4c-b581-9c48e1b350ad\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.166622 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjrhm\" (UniqueName: \"kubernetes.io/projected/46509879-7d53-4e4c-b581-9c48e1b350ad-kube-api-access-xjrhm\") pod \"46509879-7d53-4e4c-b581-9c48e1b350ad\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.166673 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-ovn-metadata-agent-neutron-config-0\") pod \"46509879-7d53-4e4c-b581-9c48e1b350ad\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 
01:39:24.166696 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-ssh-key\") pod \"46509879-7d53-4e4c-b581-9c48e1b350ad\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.166726 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-nova-metadata-neutron-config-0\") pod \"46509879-7d53-4e4c-b581-9c48e1b350ad\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.166771 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-inventory\") pod \"46509879-7d53-4e4c-b581-9c48e1b350ad\" (UID: \"46509879-7d53-4e4c-b581-9c48e1b350ad\") " Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.172786 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46509879-7d53-4e4c-b581-9c48e1b350ad-kube-api-access-xjrhm" (OuterVolumeSpecName: "kube-api-access-xjrhm") pod "46509879-7d53-4e4c-b581-9c48e1b350ad" (UID: "46509879-7d53-4e4c-b581-9c48e1b350ad"). InnerVolumeSpecName "kube-api-access-xjrhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.181159 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "46509879-7d53-4e4c-b581-9c48e1b350ad" (UID: "46509879-7d53-4e4c-b581-9c48e1b350ad"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.197363 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "46509879-7d53-4e4c-b581-9c48e1b350ad" (UID: "46509879-7d53-4e4c-b581-9c48e1b350ad"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.206135 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "46509879-7d53-4e4c-b581-9c48e1b350ad" (UID: "46509879-7d53-4e4c-b581-9c48e1b350ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.211971 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "46509879-7d53-4e4c-b581-9c48e1b350ad" (UID: "46509879-7d53-4e4c-b581-9c48e1b350ad"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.214137 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-inventory" (OuterVolumeSpecName: "inventory") pod "46509879-7d53-4e4c-b581-9c48e1b350ad" (UID: "46509879-7d53-4e4c-b581-9c48e1b350ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.269672 4991 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.269720 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjrhm\" (UniqueName: \"kubernetes.io/projected/46509879-7d53-4e4c-b581-9c48e1b350ad-kube-api-access-xjrhm\") on node \"crc\" DevicePath \"\"" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.269737 4991 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.269754 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.269768 4991 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.269782 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46509879-7d53-4e4c-b581-9c48e1b350ad-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.481823 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" 
event={"ID":"46509879-7d53-4e4c-b581-9c48e1b350ad","Type":"ContainerDied","Data":"4fe18c617dd667518001dfd519d186c458e89a653244e0b150f82086202553ca"} Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.481886 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fe18c617dd667518001dfd519d186c458e89a653244e0b150f82086202553ca" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.481904 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.607998 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv"] Dec 06 01:39:24 crc kubenswrapper[4991]: E1206 01:39:24.608628 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46509879-7d53-4e4c-b581-9c48e1b350ad" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.608647 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="46509879-7d53-4e4c-b581-9c48e1b350ad" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.608842 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="46509879-7d53-4e4c-b581-9c48e1b350ad" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.609453 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.611702 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.612508 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.612853 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.613766 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.617284 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.636747 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv"] Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.779615 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.779784 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.779866 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.780065 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6jjq\" (UniqueName: \"kubernetes.io/projected/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-kube-api-access-s6jjq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.780120 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.881540 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.881591 4991 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.881660 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6jjq\" (UniqueName: \"kubernetes.io/projected/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-kube-api-access-s6jjq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.881685 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.881715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.886941 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.887025 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.888205 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.888425 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.903265 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6jjq\" (UniqueName: \"kubernetes.io/projected/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-kube-api-access-s6jjq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rzttv\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:24 crc kubenswrapper[4991]: I1206 01:39:24.933150 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:39:25 crc kubenswrapper[4991]: I1206 01:39:25.480091 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv"] Dec 06 01:39:25 crc kubenswrapper[4991]: I1206 01:39:25.491673 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" event={"ID":"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4","Type":"ContainerStarted","Data":"da824567ab535f5b0ad629d07be50725c719a3b15219766e87bd38845dc7e0a7"} Dec 06 01:39:26 crc kubenswrapper[4991]: I1206 01:39:26.509811 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" event={"ID":"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4","Type":"ContainerStarted","Data":"d0e5902e0c3b23bbe62fdd2af31b7576c7709e93004af3a537285ae6e92d114d"} Dec 06 01:39:26 crc kubenswrapper[4991]: I1206 01:39:26.537634 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" podStartSLOduration=2.343931662 podStartE2EDuration="2.537611502s" podCreationTimestamp="2025-12-06 01:39:24 +0000 UTC" firstStartedPulling="2025-12-06 01:39:25.480413091 +0000 UTC m=+2238.830703559" lastFinishedPulling="2025-12-06 01:39:25.674092931 +0000 UTC m=+2239.024383399" observedRunningTime="2025-12-06 01:39:26.530901684 +0000 UTC m=+2239.881192182" watchObservedRunningTime="2025-12-06 01:39:26.537611502 +0000 UTC m=+2239.887901980" Dec 06 01:41:16 crc kubenswrapper[4991]: I1206 01:41:16.740160 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:41:16 crc kubenswrapper[4991]: I1206 
01:41:16.740868 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:41:46 crc kubenswrapper[4991]: I1206 01:41:46.739503 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:41:46 crc kubenswrapper[4991]: I1206 01:41:46.740429 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:42:16 crc kubenswrapper[4991]: I1206 01:42:16.739758 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:42:16 crc kubenswrapper[4991]: I1206 01:42:16.742508 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:42:16 crc kubenswrapper[4991]: I1206 01:42:16.742786 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:42:16 crc kubenswrapper[4991]: I1206 01:42:16.744551 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:42:16 crc kubenswrapper[4991]: I1206 01:42:16.744837 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" gracePeriod=600 Dec 06 01:42:16 crc kubenswrapper[4991]: E1206 01:42:16.872801 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:42:17 crc kubenswrapper[4991]: I1206 01:42:17.427788 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" exitCode=0 Dec 06 01:42:17 crc kubenswrapper[4991]: I1206 01:42:17.427835 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8"} Dec 06 01:42:17 crc 
kubenswrapper[4991]: I1206 01:42:17.427871 4991 scope.go:117] "RemoveContainer" containerID="a03b675513fc3791b58894fb873d91c11d8a1aef5118ca63efd76f9359bb700d" Dec 06 01:42:17 crc kubenswrapper[4991]: I1206 01:42:17.428956 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:42:17 crc kubenswrapper[4991]: E1206 01:42:17.429346 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:42:31 crc kubenswrapper[4991]: I1206 01:42:31.038597 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:42:31 crc kubenswrapper[4991]: E1206 01:42:31.039927 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.730592 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jv6zs"] Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.736325 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.744157 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv6zs"] Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.813855 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-utilities\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.814501 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4bbw\" (UniqueName: \"kubernetes.io/projected/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-kube-api-access-c4bbw\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.815031 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-catalog-content\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.917049 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-catalog-content\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.917343 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-utilities\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.917496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4bbw\" (UniqueName: \"kubernetes.io/projected/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-kube-api-access-c4bbw\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.917519 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-catalog-content\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.917794 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-utilities\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:33 crc kubenswrapper[4991]: I1206 01:42:33.937038 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4bbw\" (UniqueName: \"kubernetes.io/projected/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-kube-api-access-c4bbw\") pod \"redhat-marketplace-jv6zs\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:34 crc kubenswrapper[4991]: I1206 01:42:34.082376 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:34 crc kubenswrapper[4991]: I1206 01:42:34.566924 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv6zs"] Dec 06 01:42:34 crc kubenswrapper[4991]: I1206 01:42:34.621181 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv6zs" event={"ID":"9d1ebd61-4db6-43d5-8c72-a403f77ecc08","Type":"ContainerStarted","Data":"3be94e65339b23245eabc0f5d3b0b4da0efd18cb2f4ac25e106da266fcdf24ee"} Dec 06 01:42:35 crc kubenswrapper[4991]: I1206 01:42:35.638872 4991 generic.go:334] "Generic (PLEG): container finished" podID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerID="9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5" exitCode=0 Dec 06 01:42:35 crc kubenswrapper[4991]: I1206 01:42:35.639019 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv6zs" event={"ID":"9d1ebd61-4db6-43d5-8c72-a403f77ecc08","Type":"ContainerDied","Data":"9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5"} Dec 06 01:42:36 crc kubenswrapper[4991]: I1206 01:42:36.658836 4991 generic.go:334] "Generic (PLEG): container finished" podID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerID="84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e" exitCode=0 Dec 06 01:42:36 crc kubenswrapper[4991]: I1206 01:42:36.658904 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv6zs" event={"ID":"9d1ebd61-4db6-43d5-8c72-a403f77ecc08","Type":"ContainerDied","Data":"84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e"} Dec 06 01:42:37 crc kubenswrapper[4991]: I1206 01:42:37.682342 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv6zs" 
event={"ID":"9d1ebd61-4db6-43d5-8c72-a403f77ecc08","Type":"ContainerStarted","Data":"d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242"} Dec 06 01:42:37 crc kubenswrapper[4991]: I1206 01:42:37.713095 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jv6zs" podStartSLOduration=3.279913846 podStartE2EDuration="4.713070901s" podCreationTimestamp="2025-12-06 01:42:33 +0000 UTC" firstStartedPulling="2025-12-06 01:42:35.643042923 +0000 UTC m=+2428.993333421" lastFinishedPulling="2025-12-06 01:42:37.076199998 +0000 UTC m=+2430.426490476" observedRunningTime="2025-12-06 01:42:37.701745871 +0000 UTC m=+2431.052036339" watchObservedRunningTime="2025-12-06 01:42:37.713070901 +0000 UTC m=+2431.063361379" Dec 06 01:42:43 crc kubenswrapper[4991]: I1206 01:42:43.039927 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:42:43 crc kubenswrapper[4991]: E1206 01:42:43.041208 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:42:44 crc kubenswrapper[4991]: I1206 01:42:44.083657 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:44 crc kubenswrapper[4991]: I1206 01:42:44.084527 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:44 crc kubenswrapper[4991]: I1206 01:42:44.161747 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:44 crc kubenswrapper[4991]: I1206 01:42:44.867814 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:44 crc kubenswrapper[4991]: I1206 01:42:44.934253 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv6zs"] Dec 06 01:42:46 crc kubenswrapper[4991]: I1206 01:42:46.789249 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jv6zs" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="registry-server" containerID="cri-o://d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242" gracePeriod=2 Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.320137 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.431624 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-utilities\") pod \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.431747 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-catalog-content\") pod \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.431785 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4bbw\" (UniqueName: \"kubernetes.io/projected/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-kube-api-access-c4bbw\") pod 
\"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\" (UID: \"9d1ebd61-4db6-43d5-8c72-a403f77ecc08\") " Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.432564 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-utilities" (OuterVolumeSpecName: "utilities") pod "9d1ebd61-4db6-43d5-8c72-a403f77ecc08" (UID: "9d1ebd61-4db6-43d5-8c72-a403f77ecc08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.440512 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-kube-api-access-c4bbw" (OuterVolumeSpecName: "kube-api-access-c4bbw") pod "9d1ebd61-4db6-43d5-8c72-a403f77ecc08" (UID: "9d1ebd61-4db6-43d5-8c72-a403f77ecc08"). InnerVolumeSpecName "kube-api-access-c4bbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.453118 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d1ebd61-4db6-43d5-8c72-a403f77ecc08" (UID: "9d1ebd61-4db6-43d5-8c72-a403f77ecc08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.534588 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.534953 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.535124 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4bbw\" (UniqueName: \"kubernetes.io/projected/9d1ebd61-4db6-43d5-8c72-a403f77ecc08-kube-api-access-c4bbw\") on node \"crc\" DevicePath \"\"" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.824302 4991 generic.go:334] "Generic (PLEG): container finished" podID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerID="d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242" exitCode=0 Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.824375 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv6zs" event={"ID":"9d1ebd61-4db6-43d5-8c72-a403f77ecc08","Type":"ContainerDied","Data":"d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242"} Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.824395 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv6zs" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.824427 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv6zs" event={"ID":"9d1ebd61-4db6-43d5-8c72-a403f77ecc08","Type":"ContainerDied","Data":"3be94e65339b23245eabc0f5d3b0b4da0efd18cb2f4ac25e106da266fcdf24ee"} Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.824468 4991 scope.go:117] "RemoveContainer" containerID="d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.889903 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv6zs"] Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.895293 4991 scope.go:117] "RemoveContainer" containerID="84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.899385 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv6zs"] Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.924905 4991 scope.go:117] "RemoveContainer" containerID="9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.970658 4991 scope.go:117] "RemoveContainer" containerID="d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242" Dec 06 01:42:47 crc kubenswrapper[4991]: E1206 01:42:47.971250 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242\": container with ID starting with d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242 not found: ID does not exist" containerID="d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.971290 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242"} err="failed to get container status \"d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242\": rpc error: code = NotFound desc = could not find container \"d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242\": container with ID starting with d440a61104c87dc3d97fc795efb04fd1ed3b4f37ef0e6d292615b0e90ee57242 not found: ID does not exist" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.971313 4991 scope.go:117] "RemoveContainer" containerID="84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e" Dec 06 01:42:47 crc kubenswrapper[4991]: E1206 01:42:47.971711 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e\": container with ID starting with 84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e not found: ID does not exist" containerID="84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.971766 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e"} err="failed to get container status \"84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e\": rpc error: code = NotFound desc = could not find container \"84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e\": container with ID starting with 84cea02690fc3f204290330a464bbe89479e6a2a29c382ac6db5fbc6a5f28c9e not found: ID does not exist" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.971802 4991 scope.go:117] "RemoveContainer" containerID="9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5" Dec 06 01:42:47 crc kubenswrapper[4991]: E1206 
01:42:47.972313 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5\": container with ID starting with 9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5 not found: ID does not exist" containerID="9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5" Dec 06 01:42:47 crc kubenswrapper[4991]: I1206 01:42:47.972341 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5"} err="failed to get container status \"9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5\": rpc error: code = NotFound desc = could not find container \"9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5\": container with ID starting with 9b51358fe671c6d99f6141bddc1efd84b98a626a919d9a4509d6da5c5e7643c5 not found: ID does not exist" Dec 06 01:42:47 crc kubenswrapper[4991]: E1206 01:42:47.983858 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1ebd61_4db6_43d5_8c72_a403f77ecc08.slice/crio-3be94e65339b23245eabc0f5d3b0b4da0efd18cb2f4ac25e106da266fcdf24ee\": RecentStats: unable to find data in memory cache]" Dec 06 01:42:49 crc kubenswrapper[4991]: I1206 01:42:49.071449 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" path="/var/lib/kubelet/pods/9d1ebd61-4db6-43d5-8c72-a403f77ecc08/volumes" Dec 06 01:42:54 crc kubenswrapper[4991]: I1206 01:42:54.037728 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:42:54 crc kubenswrapper[4991]: E1206 01:42:54.038402 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:43:09 crc kubenswrapper[4991]: I1206 01:43:09.038686 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:43:09 crc kubenswrapper[4991]: E1206 01:43:09.040089 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:43:22 crc kubenswrapper[4991]: I1206 01:43:22.038353 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:43:22 crc kubenswrapper[4991]: E1206 01:43:22.039227 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:43:33 crc kubenswrapper[4991]: I1206 01:43:33.038930 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:43:33 crc kubenswrapper[4991]: E1206 01:43:33.040046 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:43:41 crc kubenswrapper[4991]: I1206 01:43:41.451777 4991 generic.go:334] "Generic (PLEG): container finished" podID="908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" containerID="d0e5902e0c3b23bbe62fdd2af31b7576c7709e93004af3a537285ae6e92d114d" exitCode=0 Dec 06 01:43:41 crc kubenswrapper[4991]: I1206 01:43:41.451847 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" event={"ID":"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4","Type":"ContainerDied","Data":"d0e5902e0c3b23bbe62fdd2af31b7576c7709e93004af3a537285ae6e92d114d"} Dec 06 01:43:42 crc kubenswrapper[4991]: I1206 01:43:42.972916 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.058467 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-secret-0\") pod \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.058531 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6jjq\" (UniqueName: \"kubernetes.io/projected/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-kube-api-access-s6jjq\") pod \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.058675 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-combined-ca-bundle\") pod \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.058708 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-inventory\") pod \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.058739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-ssh-key\") pod \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\" (UID: \"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4\") " Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.066955 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" (UID: "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.069185 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-kube-api-access-s6jjq" (OuterVolumeSpecName: "kube-api-access-s6jjq") pod "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" (UID: "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4"). InnerVolumeSpecName "kube-api-access-s6jjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.092519 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" (UID: "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.093750 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" (UID: "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.095345 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-inventory" (OuterVolumeSpecName: "inventory") pod "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" (UID: "908dd8e5-0b1f-494f-8e10-5ee48d1faeb4"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.162001 4991 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.162061 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6jjq\" (UniqueName: \"kubernetes.io/projected/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-kube-api-access-s6jjq\") on node \"crc\" DevicePath \"\"" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.162081 4991 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.162100 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.162115 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/908dd8e5-0b1f-494f-8e10-5ee48d1faeb4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.476524 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" event={"ID":"908dd8e5-0b1f-494f-8e10-5ee48d1faeb4","Type":"ContainerDied","Data":"da824567ab535f5b0ad629d07be50725c719a3b15219766e87bd38845dc7e0a7"} Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.476584 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rzttv" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.476595 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da824567ab535f5b0ad629d07be50725c719a3b15219766e87bd38845dc7e0a7" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.590515 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk"] Dec 06 01:43:43 crc kubenswrapper[4991]: E1206 01:43:43.591564 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="extract-utilities" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.591595 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="extract-utilities" Dec 06 01:43:43 crc kubenswrapper[4991]: E1206 01:43:43.591614 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="extract-content" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.591622 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="extract-content" Dec 06 01:43:43 crc kubenswrapper[4991]: E1206 01:43:43.591637 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.591645 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 01:43:43 crc kubenswrapper[4991]: E1206 01:43:43.591675 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="registry-server" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 
01:43:43.591681 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="registry-server" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.591920 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1ebd61-4db6-43d5-8c72-a403f77ecc08" containerName="registry-server" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.591940 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="908dd8e5-0b1f-494f-8e10-5ee48d1faeb4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.592844 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.595747 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.595968 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.596065 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.595972 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.603245 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.604040 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.605757 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"openstack-aee-default-env" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.607616 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk"] Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676542 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676602 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676631 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676715 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676861 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676932 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4pl\" (UniqueName: \"kubernetes.io/projected/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-kube-api-access-xf4pl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.676963 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.677005 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.778671 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.778769 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.778817 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.778923 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: 
\"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.779185 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.779242 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.779322 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4pl\" (UniqueName: \"kubernetes.io/projected/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-kube-api-access-xf4pl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.779371 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.779405 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.781961 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.784637 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.785386 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.785547 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.785591 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.786343 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.788406 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.806890 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.811270 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4pl\" (UniqueName: 
\"kubernetes.io/projected/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-kube-api-access-xf4pl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jntfk\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:43 crc kubenswrapper[4991]: I1206 01:43:43.976339 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:43:44 crc kubenswrapper[4991]: I1206 01:43:44.566034 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk"] Dec 06 01:43:44 crc kubenswrapper[4991]: I1206 01:43:44.575217 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:43:45 crc kubenswrapper[4991]: I1206 01:43:45.500266 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" event={"ID":"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9","Type":"ContainerStarted","Data":"c42fb35293611b8cea9477c62692d190512de8a75f751d3391015d25d7287941"} Dec 06 01:43:45 crc kubenswrapper[4991]: I1206 01:43:45.501090 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" event={"ID":"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9","Type":"ContainerStarted","Data":"a2ee00e2a8b5039601805ac5e9c74a67416b6f9c638d07d1d4be6496db873503"} Dec 06 01:43:45 crc kubenswrapper[4991]: I1206 01:43:45.534067 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" podStartSLOduration=2.334034305 podStartE2EDuration="2.533960165s" podCreationTimestamp="2025-12-06 01:43:43 +0000 UTC" firstStartedPulling="2025-12-06 01:43:44.574931037 +0000 UTC m=+2497.925221505" lastFinishedPulling="2025-12-06 01:43:44.774856857 +0000 UTC m=+2498.125147365" observedRunningTime="2025-12-06 01:43:45.523753005 
+0000 UTC m=+2498.874043573" watchObservedRunningTime="2025-12-06 01:43:45.533960165 +0000 UTC m=+2498.884250653" Dec 06 01:43:46 crc kubenswrapper[4991]: I1206 01:43:46.039615 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:43:46 crc kubenswrapper[4991]: E1206 01:43:46.040060 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:43:58 crc kubenswrapper[4991]: I1206 01:43:58.038611 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:43:58 crc kubenswrapper[4991]: E1206 01:43:58.040065 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:44:11 crc kubenswrapper[4991]: I1206 01:44:11.038011 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:44:11 crc kubenswrapper[4991]: E1206 01:44:11.038720 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:44:22 crc kubenswrapper[4991]: I1206 01:44:22.038860 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:44:22 crc kubenswrapper[4991]: E1206 01:44:22.040654 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.050281 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9mmr6"] Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.055496 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.068605 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9mmr6"] Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.141492 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-catalog-content\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.141874 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hls8p\" (UniqueName: \"kubernetes.io/projected/d571f8cb-d716-4c2a-8554-5483f1d029a5-kube-api-access-hls8p\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.142022 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-utilities\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.244199 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-utilities\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.244454 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-catalog-content\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.244519 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hls8p\" (UniqueName: \"kubernetes.io/projected/d571f8cb-d716-4c2a-8554-5483f1d029a5-kube-api-access-hls8p\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.244825 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-utilities\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.245294 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-catalog-content\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.286224 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hls8p\" (UniqueName: \"kubernetes.io/projected/d571f8cb-d716-4c2a-8554-5483f1d029a5-kube-api-access-hls8p\") pod \"community-operators-9mmr6\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.391716 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:32 crc kubenswrapper[4991]: I1206 01:44:32.955226 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9mmr6"] Dec 06 01:44:33 crc kubenswrapper[4991]: I1206 01:44:33.539867 4991 generic.go:334] "Generic (PLEG): container finished" podID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerID="d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e" exitCode=0 Dec 06 01:44:33 crc kubenswrapper[4991]: I1206 01:44:33.540110 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerDied","Data":"d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e"} Dec 06 01:44:33 crc kubenswrapper[4991]: I1206 01:44:33.540355 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerStarted","Data":"bb13c4195b175796f13e3c7e1ed7f5f744084e1aaa1d966c4984a70d7d47f6f0"} Dec 06 01:44:34 crc kubenswrapper[4991]: I1206 01:44:34.038341 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:44:34 crc kubenswrapper[4991]: E1206 01:44:34.038629 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:44:34 crc kubenswrapper[4991]: I1206 01:44:34.555527 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" 
event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerStarted","Data":"611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086"} Dec 06 01:44:35 crc kubenswrapper[4991]: I1206 01:44:35.569571 4991 generic.go:334] "Generic (PLEG): container finished" podID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerID="611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086" exitCode=0 Dec 06 01:44:35 crc kubenswrapper[4991]: I1206 01:44:35.569620 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerDied","Data":"611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086"} Dec 06 01:44:36 crc kubenswrapper[4991]: I1206 01:44:36.590225 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerStarted","Data":"a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b"} Dec 06 01:44:36 crc kubenswrapper[4991]: I1206 01:44:36.620052 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9mmr6" podStartSLOduration=2.178643699 podStartE2EDuration="4.620030554s" podCreationTimestamp="2025-12-06 01:44:32 +0000 UTC" firstStartedPulling="2025-12-06 01:44:33.543024178 +0000 UTC m=+2546.893314636" lastFinishedPulling="2025-12-06 01:44:35.984411023 +0000 UTC m=+2549.334701491" observedRunningTime="2025-12-06 01:44:36.618650757 +0000 UTC m=+2549.968941235" watchObservedRunningTime="2025-12-06 01:44:36.620030554 +0000 UTC m=+2549.970321022" Dec 06 01:44:42 crc kubenswrapper[4991]: I1206 01:44:42.392236 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:42 crc kubenswrapper[4991]: I1206 01:44:42.395243 4991 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:42 crc kubenswrapper[4991]: I1206 01:44:42.471353 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:42 crc kubenswrapper[4991]: I1206 01:44:42.716538 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:42 crc kubenswrapper[4991]: I1206 01:44:42.774493 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9mmr6"] Dec 06 01:44:44 crc kubenswrapper[4991]: I1206 01:44:44.672932 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9mmr6" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="registry-server" containerID="cri-o://a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b" gracePeriod=2 Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.271352 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.392455 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hls8p\" (UniqueName: \"kubernetes.io/projected/d571f8cb-d716-4c2a-8554-5483f1d029a5-kube-api-access-hls8p\") pod \"d571f8cb-d716-4c2a-8554-5483f1d029a5\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.392804 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-utilities\") pod \"d571f8cb-d716-4c2a-8554-5483f1d029a5\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.393039 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-catalog-content\") pod \"d571f8cb-d716-4c2a-8554-5483f1d029a5\" (UID: \"d571f8cb-d716-4c2a-8554-5483f1d029a5\") " Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.393916 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-utilities" (OuterVolumeSpecName: "utilities") pod "d571f8cb-d716-4c2a-8554-5483f1d029a5" (UID: "d571f8cb-d716-4c2a-8554-5483f1d029a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.401234 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d571f8cb-d716-4c2a-8554-5483f1d029a5-kube-api-access-hls8p" (OuterVolumeSpecName: "kube-api-access-hls8p") pod "d571f8cb-d716-4c2a-8554-5483f1d029a5" (UID: "d571f8cb-d716-4c2a-8554-5483f1d029a5"). InnerVolumeSpecName "kube-api-access-hls8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.449575 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d571f8cb-d716-4c2a-8554-5483f1d029a5" (UID: "d571f8cb-d716-4c2a-8554-5483f1d029a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.495952 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hls8p\" (UniqueName: \"kubernetes.io/projected/d571f8cb-d716-4c2a-8554-5483f1d029a5-kube-api-access-hls8p\") on node \"crc\" DevicePath \"\"" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.496006 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.496020 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d571f8cb-d716-4c2a-8554-5483f1d029a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.690309 4991 generic.go:334] "Generic (PLEG): container finished" podID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerID="a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b" exitCode=0 Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.690384 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerDied","Data":"a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b"} Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.690425 4991 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9mmr6" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.690472 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9mmr6" event={"ID":"d571f8cb-d716-4c2a-8554-5483f1d029a5","Type":"ContainerDied","Data":"bb13c4195b175796f13e3c7e1ed7f5f744084e1aaa1d966c4984a70d7d47f6f0"} Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.690509 4991 scope.go:117] "RemoveContainer" containerID="a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.720462 4991 scope.go:117] "RemoveContainer" containerID="611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.737647 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9mmr6"] Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.746841 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9mmr6"] Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.764420 4991 scope.go:117] "RemoveContainer" containerID="d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.785611 4991 scope.go:117] "RemoveContainer" containerID="a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b" Dec 06 01:44:45 crc kubenswrapper[4991]: E1206 01:44:45.786317 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b\": container with ID starting with a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b not found: ID does not exist" containerID="a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.786358 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b"} err="failed to get container status \"a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b\": rpc error: code = NotFound desc = could not find container \"a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b\": container with ID starting with a90e838343f1697db3b07253e4556450b7dc3a43b9ec1f43eb3fb1e0bc8bcb9b not found: ID does not exist" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.786380 4991 scope.go:117] "RemoveContainer" containerID="611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086" Dec 06 01:44:45 crc kubenswrapper[4991]: E1206 01:44:45.786645 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086\": container with ID starting with 611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086 not found: ID does not exist" containerID="611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.786698 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086"} err="failed to get container status \"611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086\": rpc error: code = NotFound desc = could not find container \"611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086\": container with ID starting with 611c40f421a1ed0e769d29904ade49c5547cfd690c7f3d0ec75f60b894922086 not found: ID does not exist" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.786719 4991 scope.go:117] "RemoveContainer" containerID="d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e" Dec 06 01:44:45 crc kubenswrapper[4991]: E1206 
01:44:45.787011 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e\": container with ID starting with d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e not found: ID does not exist" containerID="d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e" Dec 06 01:44:45 crc kubenswrapper[4991]: I1206 01:44:45.787034 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e"} err="failed to get container status \"d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e\": rpc error: code = NotFound desc = could not find container \"d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e\": container with ID starting with d33165b6a201be08af9a581e7439aec6ec549c8a3e7cfa1f11d30e211fa7d10e not found: ID does not exist" Dec 06 01:44:47 crc kubenswrapper[4991]: I1206 01:44:47.052112 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" path="/var/lib/kubelet/pods/d571f8cb-d716-4c2a-8554-5483f1d029a5/volumes" Dec 06 01:44:49 crc kubenswrapper[4991]: I1206 01:44:49.038801 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:44:49 crc kubenswrapper[4991]: E1206 01:44:49.039745 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.920925 
4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frzg8"] Dec 06 01:44:59 crc kubenswrapper[4991]: E1206 01:44:59.924689 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="extract-utilities" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.924730 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="extract-utilities" Dec 06 01:44:59 crc kubenswrapper[4991]: E1206 01:44:59.924751 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="extract-content" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.924759 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="extract-content" Dec 06 01:44:59 crc kubenswrapper[4991]: E1206 01:44:59.924786 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="registry-server" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.924794 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="registry-server" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.925131 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d571f8cb-d716-4c2a-8554-5483f1d029a5" containerName="registry-server" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.926959 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:44:59 crc kubenswrapper[4991]: I1206 01:44:59.958125 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frzg8"] Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.125817 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-utilities\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.126281 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcxfg\" (UniqueName: \"kubernetes.io/projected/93438f02-eb70-44a6-a6d0-6852cf59e670-kube-api-access-xcxfg\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.126372 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-catalog-content\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.155943 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7"] Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.157370 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.163060 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.166252 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.178287 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7"] Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.228747 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcxfg\" (UniqueName: \"kubernetes.io/projected/93438f02-eb70-44a6-a6d0-6852cf59e670-kube-api-access-xcxfg\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.229265 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-catalog-content\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.229612 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-catalog-content\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.229930 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-utilities\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.230897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-utilities\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.265100 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcxfg\" (UniqueName: \"kubernetes.io/projected/93438f02-eb70-44a6-a6d0-6852cf59e670-kube-api-access-xcxfg\") pod \"certified-operators-frzg8\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.333192 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-secret-volume\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.333379 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-config-volume\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 
01:45:00.333471 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpd6w\" (UniqueName: \"kubernetes.io/projected/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-kube-api-access-jpd6w\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.435786 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-config-volume\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.435859 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpd6w\" (UniqueName: \"kubernetes.io/projected/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-kube-api-access-jpd6w\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.435924 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-secret-volume\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.437242 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-config-volume\") pod \"collect-profiles-29416425-qgpf7\" (UID: 
\"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.440403 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-secret-volume\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.463685 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpd6w\" (UniqueName: \"kubernetes.io/projected/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-kube-api-access-jpd6w\") pod \"collect-profiles-29416425-qgpf7\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.480692 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:00 crc kubenswrapper[4991]: I1206 01:45:00.550500 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.040616 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:45:01 crc kubenswrapper[4991]: E1206 01:45:01.041478 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.081917 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7"] Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.127412 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frzg8"] Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.907504 4991 generic.go:334] "Generic (PLEG): container finished" podID="1ab28f64-0d4f-49fe-ac08-b9ff4011070a" containerID="5f92a85b49a2223835b2acc4de757a076ccc452d5dc82184459e50880f057da7" exitCode=0 Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.907635 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" event={"ID":"1ab28f64-0d4f-49fe-ac08-b9ff4011070a","Type":"ContainerDied","Data":"5f92a85b49a2223835b2acc4de757a076ccc452d5dc82184459e50880f057da7"} Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.908227 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" 
event={"ID":"1ab28f64-0d4f-49fe-ac08-b9ff4011070a","Type":"ContainerStarted","Data":"aafaafbeb92eb956ae60248dbf91adadd346c71330cb2952f41c4da695e64b7e"} Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.911414 4991 generic.go:334] "Generic (PLEG): container finished" podID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerID="6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790" exitCode=0 Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.911497 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzg8" event={"ID":"93438f02-eb70-44a6-a6d0-6852cf59e670","Type":"ContainerDied","Data":"6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790"} Dec 06 01:45:01 crc kubenswrapper[4991]: I1206 01:45:01.911569 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzg8" event={"ID":"93438f02-eb70-44a6-a6d0-6852cf59e670","Type":"ContainerStarted","Data":"f662aa425aeb0a01a1aa93880b7b34c63c9aa52cddda0bac11ee22d46678e45b"} Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.367020 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.406856 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-config-volume\") pod \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.407108 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpd6w\" (UniqueName: \"kubernetes.io/projected/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-kube-api-access-jpd6w\") pod \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.407255 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-secret-volume\") pod \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\" (UID: \"1ab28f64-0d4f-49fe-ac08-b9ff4011070a\") " Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.407777 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ab28f64-0d4f-49fe-ac08-b9ff4011070a" (UID: "1ab28f64-0d4f-49fe-ac08-b9ff4011070a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.407972 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.414109 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ab28f64-0d4f-49fe-ac08-b9ff4011070a" (UID: "1ab28f64-0d4f-49fe-ac08-b9ff4011070a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.415272 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-kube-api-access-jpd6w" (OuterVolumeSpecName: "kube-api-access-jpd6w") pod "1ab28f64-0d4f-49fe-ac08-b9ff4011070a" (UID: "1ab28f64-0d4f-49fe-ac08-b9ff4011070a"). InnerVolumeSpecName "kube-api-access-jpd6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.509620 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpd6w\" (UniqueName: \"kubernetes.io/projected/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-kube-api-access-jpd6w\") on node \"crc\" DevicePath \"\"" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.509665 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab28f64-0d4f-49fe-ac08-b9ff4011070a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.939149 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" event={"ID":"1ab28f64-0d4f-49fe-ac08-b9ff4011070a","Type":"ContainerDied","Data":"aafaafbeb92eb956ae60248dbf91adadd346c71330cb2952f41c4da695e64b7e"} Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.939486 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aafaafbeb92eb956ae60248dbf91adadd346c71330cb2952f41c4da695e64b7e" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.939253 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416425-qgpf7" Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.942496 4991 generic.go:334] "Generic (PLEG): container finished" podID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerID="603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6" exitCode=0 Dec 06 01:45:03 crc kubenswrapper[4991]: I1206 01:45:03.942560 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzg8" event={"ID":"93438f02-eb70-44a6-a6d0-6852cf59e670","Type":"ContainerDied","Data":"603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6"} Dec 06 01:45:04 crc kubenswrapper[4991]: I1206 01:45:04.448318 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv"] Dec 06 01:45:04 crc kubenswrapper[4991]: I1206 01:45:04.456199 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416380-k4nkv"] Dec 06 01:45:04 crc kubenswrapper[4991]: I1206 01:45:04.954524 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzg8" event={"ID":"93438f02-eb70-44a6-a6d0-6852cf59e670","Type":"ContainerStarted","Data":"b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f"} Dec 06 01:45:05 crc kubenswrapper[4991]: I1206 01:45:05.049616 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d67d95f-81bb-491f-87ca-e15aba253031" path="/var/lib/kubelet/pods/0d67d95f-81bb-491f-87ca-e15aba253031/volumes" Dec 06 01:45:10 crc kubenswrapper[4991]: I1206 01:45:10.551598 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:10 crc kubenswrapper[4991]: I1206 01:45:10.554597 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:10 crc kubenswrapper[4991]: I1206 01:45:10.631450 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:10 crc kubenswrapper[4991]: I1206 01:45:10.659642 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frzg8" podStartSLOduration=9.227117438 podStartE2EDuration="11.659624718s" podCreationTimestamp="2025-12-06 01:44:59 +0000 UTC" firstStartedPulling="2025-12-06 01:45:01.915126737 +0000 UTC m=+2575.265417245" lastFinishedPulling="2025-12-06 01:45:04.347634057 +0000 UTC m=+2577.697924525" observedRunningTime="2025-12-06 01:45:04.974089675 +0000 UTC m=+2578.324380163" watchObservedRunningTime="2025-12-06 01:45:10.659624718 +0000 UTC m=+2584.009915186" Dec 06 01:45:11 crc kubenswrapper[4991]: I1206 01:45:11.093693 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:11 crc kubenswrapper[4991]: I1206 01:45:11.162476 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frzg8"] Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.037967 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:45:13 crc kubenswrapper[4991]: E1206 01:45:13.038832 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.049979 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frzg8" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="registry-server" containerID="cri-o://b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f" gracePeriod=2 Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.534538 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.638953 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-catalog-content\") pod \"93438f02-eb70-44a6-a6d0-6852cf59e670\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.639340 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcxfg\" (UniqueName: \"kubernetes.io/projected/93438f02-eb70-44a6-a6d0-6852cf59e670-kube-api-access-xcxfg\") pod \"93438f02-eb70-44a6-a6d0-6852cf59e670\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.639573 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-utilities\") pod \"93438f02-eb70-44a6-a6d0-6852cf59e670\" (UID: \"93438f02-eb70-44a6-a6d0-6852cf59e670\") " Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.640465 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-utilities" (OuterVolumeSpecName: "utilities") pod "93438f02-eb70-44a6-a6d0-6852cf59e670" (UID: "93438f02-eb70-44a6-a6d0-6852cf59e670"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.647458 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93438f02-eb70-44a6-a6d0-6852cf59e670-kube-api-access-xcxfg" (OuterVolumeSpecName: "kube-api-access-xcxfg") pod "93438f02-eb70-44a6-a6d0-6852cf59e670" (UID: "93438f02-eb70-44a6-a6d0-6852cf59e670"). InnerVolumeSpecName "kube-api-access-xcxfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.743398 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcxfg\" (UniqueName: \"kubernetes.io/projected/93438f02-eb70-44a6-a6d0-6852cf59e670-kube-api-access-xcxfg\") on node \"crc\" DevicePath \"\"" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.744394 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.872432 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93438f02-eb70-44a6-a6d0-6852cf59e670" (UID: "93438f02-eb70-44a6-a6d0-6852cf59e670"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:45:13 crc kubenswrapper[4991]: I1206 01:45:13.949731 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93438f02-eb70-44a6-a6d0-6852cf59e670-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.067930 4991 generic.go:334] "Generic (PLEG): container finished" podID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerID="b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f" exitCode=0 Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.067996 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzg8" event={"ID":"93438f02-eb70-44a6-a6d0-6852cf59e670","Type":"ContainerDied","Data":"b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f"} Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.068036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzg8" event={"ID":"93438f02-eb70-44a6-a6d0-6852cf59e670","Type":"ContainerDied","Data":"f662aa425aeb0a01a1aa93880b7b34c63c9aa52cddda0bac11ee22d46678e45b"} Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.068059 4991 scope.go:117] "RemoveContainer" containerID="b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.068092 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frzg8" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.117553 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frzg8"] Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.118345 4991 scope.go:117] "RemoveContainer" containerID="603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.133478 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frzg8"] Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.151080 4991 scope.go:117] "RemoveContainer" containerID="6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.198748 4991 scope.go:117] "RemoveContainer" containerID="b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f" Dec 06 01:45:14 crc kubenswrapper[4991]: E1206 01:45:14.199243 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f\": container with ID starting with b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f not found: ID does not exist" containerID="b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.199319 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f"} err="failed to get container status \"b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f\": rpc error: code = NotFound desc = could not find container \"b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f\": container with ID starting with b4762b176c39c0c7e5c47d075517bf2998f7ed10846a3fb436b887536d7b471f not 
found: ID does not exist" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.199356 4991 scope.go:117] "RemoveContainer" containerID="603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6" Dec 06 01:45:14 crc kubenswrapper[4991]: E1206 01:45:14.199748 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6\": container with ID starting with 603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6 not found: ID does not exist" containerID="603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.199804 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6"} err="failed to get container status \"603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6\": rpc error: code = NotFound desc = could not find container \"603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6\": container with ID starting with 603fdcfc2b52ec605ffd998854af183b08697910b67d420624c3d02a6da478c6 not found: ID does not exist" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.199837 4991 scope.go:117] "RemoveContainer" containerID="6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790" Dec 06 01:45:14 crc kubenswrapper[4991]: E1206 01:45:14.200181 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790\": container with ID starting with 6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790 not found: ID does not exist" containerID="6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790" Dec 06 01:45:14 crc kubenswrapper[4991]: I1206 01:45:14.200222 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790"} err="failed to get container status \"6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790\": rpc error: code = NotFound desc = could not find container \"6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790\": container with ID starting with 6198ac83f762a1114d16fcf94f2d74cad380f83db60c797e76d48780dba27790 not found: ID does not exist" Dec 06 01:45:15 crc kubenswrapper[4991]: I1206 01:45:15.056885 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" path="/var/lib/kubelet/pods/93438f02-eb70-44a6-a6d0-6852cf59e670/volumes" Dec 06 01:45:17 crc kubenswrapper[4991]: I1206 01:45:17.928533 4991 scope.go:117] "RemoveContainer" containerID="418db56e6e056c536019e0f33d387cd948704052752b80a1e797bb4c32ee374a" Dec 06 01:45:26 crc kubenswrapper[4991]: I1206 01:45:26.038465 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:45:26 crc kubenswrapper[4991]: E1206 01:45:26.039638 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:45:41 crc kubenswrapper[4991]: I1206 01:45:41.039162 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:45:41 crc kubenswrapper[4991]: E1206 01:45:41.042273 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:45:55 crc kubenswrapper[4991]: I1206 01:45:55.038513 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:45:55 crc kubenswrapper[4991]: E1206 01:45:55.039553 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:46:07 crc kubenswrapper[4991]: I1206 01:46:07.053935 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:46:07 crc kubenswrapper[4991]: E1206 01:46:07.054867 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:46:20 crc kubenswrapper[4991]: I1206 01:46:20.039081 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:46:20 crc kubenswrapper[4991]: E1206 01:46:20.040582 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:46:31 crc kubenswrapper[4991]: I1206 01:46:31.038301 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:46:31 crc kubenswrapper[4991]: E1206 01:46:31.039039 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:46:36 crc kubenswrapper[4991]: I1206 01:46:36.057910 4991 generic.go:334] "Generic (PLEG): container finished" podID="3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" containerID="c42fb35293611b8cea9477c62692d190512de8a75f751d3391015d25d7287941" exitCode=0 Dec 06 01:46:36 crc kubenswrapper[4991]: I1206 01:46:36.058012 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" event={"ID":"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9","Type":"ContainerDied","Data":"c42fb35293611b8cea9477c62692d190512de8a75f751d3391015d25d7287941"} Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.580570 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680411 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-inventory\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680484 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-extra-config-0\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680582 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-combined-ca-bundle\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680624 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-0\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680650 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf4pl\" (UniqueName: \"kubernetes.io/projected/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-kube-api-access-xf4pl\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680684 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-ssh-key\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680714 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-1\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680770 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-1\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.680840 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-0\") pod \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\" (UID: \"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9\") " Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.687589 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-kube-api-access-xf4pl" (OuterVolumeSpecName: "kube-api-access-xf4pl") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "kube-api-access-xf4pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.688928 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.713471 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.714240 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.715340 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.722122 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.727714 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-inventory" (OuterVolumeSpecName: "inventory") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.739375 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.757859 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" (UID: "3b3f0520-92f4-4e3e-aa3e-8950cdc437e9"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783542 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783588 4991 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783603 4991 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783612 4991 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783623 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf4pl\" (UniqueName: \"kubernetes.io/projected/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-kube-api-access-xf4pl\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783632 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783642 4991 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc 
kubenswrapper[4991]: I1206 01:46:37.783650 4991 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:37 crc kubenswrapper[4991]: I1206 01:46:37.783658 4991 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3b3f0520-92f4-4e3e-aa3e-8950cdc437e9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.081571 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" event={"ID":"3b3f0520-92f4-4e3e-aa3e-8950cdc437e9","Type":"ContainerDied","Data":"a2ee00e2a8b5039601805ac5e9c74a67416b6f9c638d07d1d4be6496db873503"} Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.081968 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2ee00e2a8b5039601805ac5e9c74a67416b6f9c638d07d1d4be6496db873503" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.081624 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jntfk" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.204293 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9"] Dec 06 01:46:38 crc kubenswrapper[4991]: E1206 01:46:38.205031 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab28f64-0d4f-49fe-ac08-b9ff4011070a" containerName="collect-profiles" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205057 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab28f64-0d4f-49fe-ac08-b9ff4011070a" containerName="collect-profiles" Dec 06 01:46:38 crc kubenswrapper[4991]: E1206 01:46:38.205071 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="registry-server" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205078 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="registry-server" Dec 06 01:46:38 crc kubenswrapper[4991]: E1206 01:46:38.205107 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205114 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 01:46:38 crc kubenswrapper[4991]: E1206 01:46:38.205132 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="extract-content" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205139 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="extract-content" Dec 06 01:46:38 crc kubenswrapper[4991]: E1206 01:46:38.205153 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="extract-utilities" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205161 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="extract-utilities" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205389 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3f0520-92f4-4e3e-aa3e-8950cdc437e9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205411 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="93438f02-eb70-44a6-a6d0-6852cf59e670" containerName="registry-server" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.205435 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab28f64-0d4f-49fe-ac08-b9ff4011070a" containerName="collect-profiles" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.206487 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.210469 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9tlxv" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.210740 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.210952 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.211132 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.211269 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.216530 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9"] Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.290651 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.290720 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.290868 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.290896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.291064 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.291135 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.291229 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87bx\" (UniqueName: \"kubernetes.io/projected/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-kube-api-access-g87bx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392704 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g87bx\" (UniqueName: \"kubernetes.io/projected/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-kube-api-access-g87bx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392762 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392790 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392843 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392886 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392950 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.392973 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.397927 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.397977 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.397928 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.398920 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.400022 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc 
kubenswrapper[4991]: I1206 01:46:38.401102 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.412148 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87bx\" (UniqueName: \"kubernetes.io/projected/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-kube-api-access-g87bx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:38 crc kubenswrapper[4991]: I1206 01:46:38.529065 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:46:39 crc kubenswrapper[4991]: I1206 01:46:39.114726 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9"] Dec 06 01:46:40 crc kubenswrapper[4991]: I1206 01:46:40.100552 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" event={"ID":"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f","Type":"ContainerStarted","Data":"7c59865f310d90662f5ab53de72c506ac13eed5c8e70c110e4feee28e7cc3f83"} Dec 06 01:46:40 crc kubenswrapper[4991]: I1206 01:46:40.100944 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" event={"ID":"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f","Type":"ContainerStarted","Data":"48d2908d1efd0e75c21588d6b2ec533b46ed5a0cf951aec649d74d76dd2b2e5d"} Dec 06 01:46:40 crc kubenswrapper[4991]: I1206 
01:46:40.123758 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" podStartSLOduration=1.905638296 podStartE2EDuration="2.123719937s" podCreationTimestamp="2025-12-06 01:46:38 +0000 UTC" firstStartedPulling="2025-12-06 01:46:39.128666707 +0000 UTC m=+2672.478957185" lastFinishedPulling="2025-12-06 01:46:39.346748348 +0000 UTC m=+2672.697038826" observedRunningTime="2025-12-06 01:46:40.119930557 +0000 UTC m=+2673.470221025" watchObservedRunningTime="2025-12-06 01:46:40.123719937 +0000 UTC m=+2673.474010445" Dec 06 01:46:43 crc kubenswrapper[4991]: I1206 01:46:43.038824 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:46:43 crc kubenswrapper[4991]: E1206 01:46:43.039770 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:46:54 crc kubenswrapper[4991]: I1206 01:46:54.039195 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:46:54 crc kubenswrapper[4991]: E1206 01:46:54.040066 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:47:07 crc kubenswrapper[4991]: I1206 
01:47:07.050066 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:47:07 crc kubenswrapper[4991]: E1206 01:47:07.059151 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:47:21 crc kubenswrapper[4991]: I1206 01:47:21.042438 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:47:21 crc kubenswrapper[4991]: I1206 01:47:21.542718 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"ff82cbcfd79f5972d1bcb99cecde7613433b7eb24ecbbd15610b87dc68e48070"} Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.111258 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k5rpp"] Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.114423 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.126187 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5rpp"] Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.302516 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-utilities\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.302922 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqgx\" (UniqueName: \"kubernetes.io/projected/057565a9-e54d-41b2-9780-5ee4c1dca98f-kube-api-access-zdqgx\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.302964 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-catalog-content\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.404650 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-utilities\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.405438 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-utilities\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.405716 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqgx\" (UniqueName: \"kubernetes.io/projected/057565a9-e54d-41b2-9780-5ee4c1dca98f-kube-api-access-zdqgx\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.405813 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-catalog-content\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.406355 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-catalog-content\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.424911 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqgx\" (UniqueName: \"kubernetes.io/projected/057565a9-e54d-41b2-9780-5ee4c1dca98f-kube-api-access-zdqgx\") pod \"redhat-operators-k5rpp\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.479221 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:25 crc kubenswrapper[4991]: I1206 01:47:25.980267 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5rpp"] Dec 06 01:47:26 crc kubenswrapper[4991]: I1206 01:47:26.595806 4991 generic.go:334] "Generic (PLEG): container finished" podID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerID="c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04" exitCode=0 Dec 06 01:47:26 crc kubenswrapper[4991]: I1206 01:47:26.596204 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerDied","Data":"c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04"} Dec 06 01:47:26 crc kubenswrapper[4991]: I1206 01:47:26.596237 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerStarted","Data":"56e7ee6fe64e9606d90ac3cc68e5decc0e07cfa4028a2ae8e065eeec4fafbd41"} Dec 06 01:47:27 crc kubenswrapper[4991]: I1206 01:47:27.610526 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerStarted","Data":"69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8"} Dec 06 01:47:29 crc kubenswrapper[4991]: I1206 01:47:29.632348 4991 generic.go:334] "Generic (PLEG): container finished" podID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerID="69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8" exitCode=0 Dec 06 01:47:29 crc kubenswrapper[4991]: I1206 01:47:29.632449 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" 
event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerDied","Data":"69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8"} Dec 06 01:47:31 crc kubenswrapper[4991]: I1206 01:47:31.662106 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerStarted","Data":"4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003"} Dec 06 01:47:31 crc kubenswrapper[4991]: I1206 01:47:31.690811 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k5rpp" podStartSLOduration=2.514979972 podStartE2EDuration="6.690790485s" podCreationTimestamp="2025-12-06 01:47:25 +0000 UTC" firstStartedPulling="2025-12-06 01:47:26.599012833 +0000 UTC m=+2719.949303301" lastFinishedPulling="2025-12-06 01:47:30.774823336 +0000 UTC m=+2724.125113814" observedRunningTime="2025-12-06 01:47:31.680846922 +0000 UTC m=+2725.031137430" watchObservedRunningTime="2025-12-06 01:47:31.690790485 +0000 UTC m=+2725.041080973" Dec 06 01:47:35 crc kubenswrapper[4991]: I1206 01:47:35.479889 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:35 crc kubenswrapper[4991]: I1206 01:47:35.480483 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:36 crc kubenswrapper[4991]: I1206 01:47:36.550537 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k5rpp" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="registry-server" probeResult="failure" output=< Dec 06 01:47:36 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:47:36 crc kubenswrapper[4991]: > Dec 06 01:47:45 crc kubenswrapper[4991]: I1206 01:47:45.556024 4991 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:45 crc kubenswrapper[4991]: I1206 01:47:45.642532 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:45 crc kubenswrapper[4991]: I1206 01:47:45.802861 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5rpp"] Dec 06 01:47:46 crc kubenswrapper[4991]: I1206 01:47:46.817058 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k5rpp" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="registry-server" containerID="cri-o://4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003" gracePeriod=2 Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.410655 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.460046 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-utilities\") pod \"057565a9-e54d-41b2-9780-5ee4c1dca98f\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.460103 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-catalog-content\") pod \"057565a9-e54d-41b2-9780-5ee4c1dca98f\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.460202 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqgx\" (UniqueName: \"kubernetes.io/projected/057565a9-e54d-41b2-9780-5ee4c1dca98f-kube-api-access-zdqgx\") pod 
\"057565a9-e54d-41b2-9780-5ee4c1dca98f\" (UID: \"057565a9-e54d-41b2-9780-5ee4c1dca98f\") " Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.461803 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-utilities" (OuterVolumeSpecName: "utilities") pod "057565a9-e54d-41b2-9780-5ee4c1dca98f" (UID: "057565a9-e54d-41b2-9780-5ee4c1dca98f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.485434 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057565a9-e54d-41b2-9780-5ee4c1dca98f-kube-api-access-zdqgx" (OuterVolumeSpecName: "kube-api-access-zdqgx") pod "057565a9-e54d-41b2-9780-5ee4c1dca98f" (UID: "057565a9-e54d-41b2-9780-5ee4c1dca98f"). InnerVolumeSpecName "kube-api-access-zdqgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.563268 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.563327 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdqgx\" (UniqueName: \"kubernetes.io/projected/057565a9-e54d-41b2-9780-5ee4c1dca98f-kube-api-access-zdqgx\") on node \"crc\" DevicePath \"\"" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.577552 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "057565a9-e54d-41b2-9780-5ee4c1dca98f" (UID: "057565a9-e54d-41b2-9780-5ee4c1dca98f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.665421 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057565a9-e54d-41b2-9780-5ee4c1dca98f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.835790 4991 generic.go:334] "Generic (PLEG): container finished" podID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerID="4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003" exitCode=0 Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.836250 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerDied","Data":"4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003"} Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.836298 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5rpp" event={"ID":"057565a9-e54d-41b2-9780-5ee4c1dca98f","Type":"ContainerDied","Data":"56e7ee6fe64e9606d90ac3cc68e5decc0e07cfa4028a2ae8e065eeec4fafbd41"} Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.836330 4991 scope.go:117] "RemoveContainer" containerID="4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.836531 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5rpp" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.897666 4991 scope.go:117] "RemoveContainer" containerID="69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8" Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.900540 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5rpp"] Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.914107 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k5rpp"] Dec 06 01:47:47 crc kubenswrapper[4991]: I1206 01:47:47.961694 4991 scope.go:117] "RemoveContainer" containerID="c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04" Dec 06 01:47:48 crc kubenswrapper[4991]: I1206 01:47:48.006036 4991 scope.go:117] "RemoveContainer" containerID="4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003" Dec 06 01:47:48 crc kubenswrapper[4991]: E1206 01:47:48.006544 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003\": container with ID starting with 4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003 not found: ID does not exist" containerID="4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003" Dec 06 01:47:48 crc kubenswrapper[4991]: I1206 01:47:48.006580 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003"} err="failed to get container status \"4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003\": rpc error: code = NotFound desc = could not find container \"4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003\": container with ID starting with 4278a9afcebf9f9f97d583adc901885fe817040fd90f82d7b42ac060deac2003 not found: ID does 
not exist" Dec 06 01:47:48 crc kubenswrapper[4991]: I1206 01:47:48.006603 4991 scope.go:117] "RemoveContainer" containerID="69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8" Dec 06 01:47:48 crc kubenswrapper[4991]: E1206 01:47:48.006993 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8\": container with ID starting with 69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8 not found: ID does not exist" containerID="69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8" Dec 06 01:47:48 crc kubenswrapper[4991]: I1206 01:47:48.007053 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8"} err="failed to get container status \"69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8\": rpc error: code = NotFound desc = could not find container \"69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8\": container with ID starting with 69cd124617c134920c07c610a9393653b8d2647d094d8d1b12aa429f2ff281a8 not found: ID does not exist" Dec 06 01:47:48 crc kubenswrapper[4991]: I1206 01:47:48.007097 4991 scope.go:117] "RemoveContainer" containerID="c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04" Dec 06 01:47:48 crc kubenswrapper[4991]: E1206 01:47:48.007838 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04\": container with ID starting with c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04 not found: ID does not exist" containerID="c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04" Dec 06 01:47:48 crc kubenswrapper[4991]: I1206 01:47:48.007869 4991 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04"} err="failed to get container status \"c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04\": rpc error: code = NotFound desc = could not find container \"c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04\": container with ID starting with c0f5de9f0b3974315260625dc7780b92513f94e6272ade0077b43be0d4e5ae04 not found: ID does not exist" Dec 06 01:47:49 crc kubenswrapper[4991]: I1206 01:47:49.062702 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" path="/var/lib/kubelet/pods/057565a9-e54d-41b2-9780-5ee4c1dca98f/volumes" Dec 06 01:49:29 crc kubenswrapper[4991]: I1206 01:49:29.058974 4991 generic.go:334] "Generic (PLEG): container finished" podID="3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" containerID="7c59865f310d90662f5ab53de72c506ac13eed5c8e70c110e4feee28e7cc3f83" exitCode=0 Dec 06 01:49:29 crc kubenswrapper[4991]: I1206 01:49:29.059066 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" event={"ID":"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f","Type":"ContainerDied","Data":"7c59865f310d90662f5ab53de72c506ac13eed5c8e70c110e4feee28e7cc3f83"} Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.568956 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.662037 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-telemetry-combined-ca-bundle\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.662114 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-1\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.662172 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ssh-key\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.662202 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g87bx\" (UniqueName: \"kubernetes.io/projected/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-kube-api-access-g87bx\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.662249 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-2\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 
01:49:30.662269 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-inventory\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.662312 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-0\") pod \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\" (UID: \"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f\") " Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.683782 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.693503 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-kube-api-access-g87bx" (OuterVolumeSpecName: "kube-api-access-g87bx") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "kube-api-access-g87bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.702663 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.703174 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.712338 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-inventory" (OuterVolumeSpecName: "inventory") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.718345 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.737207 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" (UID: "3067d2e3-f5b3-45c8-9e37-8a29a2dd153f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.764162 4991 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.764192 4991 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.764202 4991 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.764212 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.764221 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g87bx\" (UniqueName: \"kubernetes.io/projected/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-kube-api-access-g87bx\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:30 crc 
kubenswrapper[4991]: I1206 01:49:30.764230 4991 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:30 crc kubenswrapper[4991]: I1206 01:49:30.764239 4991 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3067d2e3-f5b3-45c8-9e37-8a29a2dd153f-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 01:49:31 crc kubenswrapper[4991]: I1206 01:49:31.080171 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" event={"ID":"3067d2e3-f5b3-45c8-9e37-8a29a2dd153f","Type":"ContainerDied","Data":"48d2908d1efd0e75c21588d6b2ec533b46ed5a0cf951aec649d74d76dd2b2e5d"} Dec 06 01:49:31 crc kubenswrapper[4991]: I1206 01:49:31.080228 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d2908d1efd0e75c21588d6b2ec533b46ed5a0cf951aec649d74d76dd2b2e5d" Dec 06 01:49:31 crc kubenswrapper[4991]: I1206 01:49:31.080230 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9" Dec 06 01:49:46 crc kubenswrapper[4991]: I1206 01:49:46.740093 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:49:46 crc kubenswrapper[4991]: I1206 01:49:46.740776 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:50:16 crc kubenswrapper[4991]: I1206 01:50:16.740003 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:50:16 crc kubenswrapper[4991]: I1206 01:50:16.740643 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.677489 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 01:50:20 crc kubenswrapper[4991]: E1206 01:50:20.678861 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="extract-utilities" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 
01:50:20.678878 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="extract-utilities" Dec 06 01:50:20 crc kubenswrapper[4991]: E1206 01:50:20.678888 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="registry-server" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.678894 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="registry-server" Dec 06 01:50:20 crc kubenswrapper[4991]: E1206 01:50:20.678933 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="extract-content" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.678939 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="extract-content" Dec 06 01:50:20 crc kubenswrapper[4991]: E1206 01:50:20.678954 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.678960 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.679277 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="057565a9-e54d-41b2-9780-5ee4c1dca98f" containerName="registry-server" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.679314 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3067d2e3-f5b3-45c8-9e37-8a29a2dd153f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.680262 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.683571 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.683619 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.683838 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-759tp" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.684414 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.695310 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.802378 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.802439 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.802609 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config-secret\") 
pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.802715 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.802816 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.803050 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-config-data\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.803229 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzsk\" (UniqueName: \"kubernetes.io/projected/67d48b5b-50a6-4515-9ca7-bc491239938a-kube-api-access-mgzsk\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.803319 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.803528 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.905264 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.905967 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.907053 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.907522 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " 
pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.907647 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.907791 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-config-data\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.906971 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.908007 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgzsk\" (UniqueName: \"kubernetes.io/projected/67d48b5b-50a6-4515-9ca7-bc491239938a-kube-api-access-mgzsk\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.908146 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.908273 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.908434 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.908887 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.909305 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.910356 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-config-data\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.912572 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.917960 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.919101 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.925780 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgzsk\" (UniqueName: \"kubernetes.io/projected/67d48b5b-50a6-4515-9ca7-bc491239938a-kube-api-access-mgzsk\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:20 crc kubenswrapper[4991]: I1206 01:50:20.969354 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " pod="openstack/tempest-tests-tempest" Dec 06 01:50:21 crc kubenswrapper[4991]: I1206 01:50:21.012913 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 01:50:21 crc kubenswrapper[4991]: I1206 01:50:21.567181 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 01:50:21 crc kubenswrapper[4991]: I1206 01:50:21.579612 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:50:21 crc kubenswrapper[4991]: I1206 01:50:21.639554 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"67d48b5b-50a6-4515-9ca7-bc491239938a","Type":"ContainerStarted","Data":"b66a1fdf75be0991435d200277b3e3946ebafc9eefbdbb7e4b91c643417f351f"} Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.739574 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.740301 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.740355 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.740898 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff82cbcfd79f5972d1bcb99cecde7613433b7eb24ecbbd15610b87dc68e48070"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.740960 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://ff82cbcfd79f5972d1bcb99cecde7613433b7eb24ecbbd15610b87dc68e48070" gracePeriod=600 Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.975607 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="ff82cbcfd79f5972d1bcb99cecde7613433b7eb24ecbbd15610b87dc68e48070" exitCode=0 Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.975628 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"ff82cbcfd79f5972d1bcb99cecde7613433b7eb24ecbbd15610b87dc68e48070"} Dec 06 01:50:46 crc kubenswrapper[4991]: I1206 01:50:46.975685 4991 scope.go:117] "RemoveContainer" containerID="1a1ae10ecce63eccf3bd1a24ea89c8ff30bbd03594b3f18dc363c75cc9304fa8" Dec 06 01:50:51 crc kubenswrapper[4991]: E1206 01:50:51.128742 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 06 01:50:51 crc kubenswrapper[4991]: E1206 01:50:51.129734 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgzsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(67d48b5b-50a6-4515-9ca7-bc491239938a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 01:50:51 crc kubenswrapper[4991]: E1206 01:50:51.130897 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="67d48b5b-50a6-4515-9ca7-bc491239938a" Dec 06 01:50:52 crc kubenswrapper[4991]: I1206 01:50:52.040364 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0"} Dec 06 01:50:52 crc kubenswrapper[4991]: E1206 01:50:52.043040 4991 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="67d48b5b-50a6-4515-9ca7-bc491239938a" Dec 06 01:51:07 crc kubenswrapper[4991]: I1206 01:51:07.212179 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"67d48b5b-50a6-4515-9ca7-bc491239938a","Type":"ContainerStarted","Data":"d432e23e5c24ea0c1b9b7e136a07cffe9da78cd514ad1093342994d78c496aef"} Dec 06 01:51:07 crc kubenswrapper[4991]: I1206 01:51:07.244399 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.308848058 podStartE2EDuration="48.244368422s" podCreationTimestamp="2025-12-06 01:50:19 +0000 UTC" firstStartedPulling="2025-12-06 01:50:21.579101195 +0000 UTC m=+2894.929391703" lastFinishedPulling="2025-12-06 01:51:05.514621569 +0000 UTC m=+2938.864912067" observedRunningTime="2025-12-06 01:51:07.23217951 +0000 UTC m=+2940.582469978" watchObservedRunningTime="2025-12-06 01:51:07.244368422 +0000 UTC m=+2940.594658930" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.306736 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnnn"] Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.321311 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.340178 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-catalog-content\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.340351 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-utilities\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.340584 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7kh\" (UniqueName: \"kubernetes.io/projected/21e4af18-7750-4560-a70f-27e2ee289cfa-kube-api-access-dd7kh\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.351062 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnnn"] Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.442516 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-catalog-content\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.442563 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-utilities\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.442643 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7kh\" (UniqueName: \"kubernetes.io/projected/21e4af18-7750-4560-a70f-27e2ee289cfa-kube-api-access-dd7kh\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.443460 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-utilities\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.443584 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-catalog-content\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.472251 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7kh\" (UniqueName: \"kubernetes.io/projected/21e4af18-7750-4560-a70f-27e2ee289cfa-kube-api-access-dd7kh\") pod \"redhat-marketplace-wcnnn\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:05 crc kubenswrapper[4991]: I1206 01:53:05.666523 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:06 crc kubenswrapper[4991]: I1206 01:53:06.228568 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnnn"] Dec 06 01:53:06 crc kubenswrapper[4991]: I1206 01:53:06.596778 4991 generic.go:334] "Generic (PLEG): container finished" podID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerID="573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668" exitCode=0 Dec 06 01:53:06 crc kubenswrapper[4991]: I1206 01:53:06.596940 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerDied","Data":"573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668"} Dec 06 01:53:06 crc kubenswrapper[4991]: I1206 01:53:06.597635 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerStarted","Data":"7a5b01139fd78c879bb40efce042f86d128ffb7050db74037a750a4936a18b7e"} Dec 06 01:53:07 crc kubenswrapper[4991]: I1206 01:53:07.611090 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerStarted","Data":"f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef"} Dec 06 01:53:08 crc kubenswrapper[4991]: I1206 01:53:08.627715 4991 generic.go:334] "Generic (PLEG): container finished" podID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerID="f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef" exitCode=0 Dec 06 01:53:08 crc kubenswrapper[4991]: I1206 01:53:08.627796 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" 
event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerDied","Data":"f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef"} Dec 06 01:53:09 crc kubenswrapper[4991]: I1206 01:53:09.649325 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerStarted","Data":"e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633"} Dec 06 01:53:15 crc kubenswrapper[4991]: I1206 01:53:15.666840 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:15 crc kubenswrapper[4991]: I1206 01:53:15.667789 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:15 crc kubenswrapper[4991]: I1206 01:53:15.754450 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:15 crc kubenswrapper[4991]: I1206 01:53:15.801047 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcnnn" podStartSLOduration=8.122627398 podStartE2EDuration="10.801026634s" podCreationTimestamp="2025-12-06 01:53:05 +0000 UTC" firstStartedPulling="2025-12-06 01:53:06.601658203 +0000 UTC m=+3059.951948711" lastFinishedPulling="2025-12-06 01:53:09.280057459 +0000 UTC m=+3062.630347947" observedRunningTime="2025-12-06 01:53:09.685182529 +0000 UTC m=+3063.035473017" watchObservedRunningTime="2025-12-06 01:53:15.801026634 +0000 UTC m=+3069.151317112" Dec 06 01:53:15 crc kubenswrapper[4991]: I1206 01:53:15.820362 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:16 crc kubenswrapper[4991]: I1206 01:53:16.012872 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wcnnn"] Dec 06 01:53:16 crc kubenswrapper[4991]: I1206 01:53:16.739453 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:53:16 crc kubenswrapper[4991]: I1206 01:53:16.739894 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:53:17 crc kubenswrapper[4991]: I1206 01:53:17.752800 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wcnnn" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="registry-server" containerID="cri-o://e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633" gracePeriod=2 Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.330182 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.411303 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd7kh\" (UniqueName: \"kubernetes.io/projected/21e4af18-7750-4560-a70f-27e2ee289cfa-kube-api-access-dd7kh\") pod \"21e4af18-7750-4560-a70f-27e2ee289cfa\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.411438 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-utilities\") pod \"21e4af18-7750-4560-a70f-27e2ee289cfa\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.411646 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-catalog-content\") pod \"21e4af18-7750-4560-a70f-27e2ee289cfa\" (UID: \"21e4af18-7750-4560-a70f-27e2ee289cfa\") " Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.412356 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-utilities" (OuterVolumeSpecName: "utilities") pod "21e4af18-7750-4560-a70f-27e2ee289cfa" (UID: "21e4af18-7750-4560-a70f-27e2ee289cfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.418321 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e4af18-7750-4560-a70f-27e2ee289cfa-kube-api-access-dd7kh" (OuterVolumeSpecName: "kube-api-access-dd7kh") pod "21e4af18-7750-4560-a70f-27e2ee289cfa" (UID: "21e4af18-7750-4560-a70f-27e2ee289cfa"). InnerVolumeSpecName "kube-api-access-dd7kh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.435857 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21e4af18-7750-4560-a70f-27e2ee289cfa" (UID: "21e4af18-7750-4560-a70f-27e2ee289cfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.514028 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd7kh\" (UniqueName: \"kubernetes.io/projected/21e4af18-7750-4560-a70f-27e2ee289cfa-kube-api-access-dd7kh\") on node \"crc\" DevicePath \"\"" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.514071 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.514084 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e4af18-7750-4560-a70f-27e2ee289cfa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.766286 4991 generic.go:334] "Generic (PLEG): container finished" podID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerID="e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633" exitCode=0 Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.767954 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerDied","Data":"e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633"} Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.768013 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wcnnn" event={"ID":"21e4af18-7750-4560-a70f-27e2ee289cfa","Type":"ContainerDied","Data":"7a5b01139fd78c879bb40efce042f86d128ffb7050db74037a750a4936a18b7e"} Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.768034 4991 scope.go:117] "RemoveContainer" containerID="e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.768174 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcnnn" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.810742 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnnn"] Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.813535 4991 scope.go:117] "RemoveContainer" containerID="f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.819697 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcnnn"] Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.837827 4991 scope.go:117] "RemoveContainer" containerID="573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.890584 4991 scope.go:117] "RemoveContainer" containerID="e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633" Dec 06 01:53:18 crc kubenswrapper[4991]: E1206 01:53:18.891521 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633\": container with ID starting with e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633 not found: ID does not exist" containerID="e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.891746 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633"} err="failed to get container status \"e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633\": rpc error: code = NotFound desc = could not find container \"e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633\": container with ID starting with e108acff5e3afd2e95330488fa7a3e26168b9f259480cec4563c7f52086a3633 not found: ID does not exist" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.891945 4991 scope.go:117] "RemoveContainer" containerID="f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef" Dec 06 01:53:18 crc kubenswrapper[4991]: E1206 01:53:18.892564 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef\": container with ID starting with f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef not found: ID does not exist" containerID="f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.892631 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef"} err="failed to get container status \"f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef\": rpc error: code = NotFound desc = could not find container \"f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef\": container with ID starting with f324307b02da2219ded96ebd1a028ed04889d5b9c75bfd4af9bb832b41983fef not found: ID does not exist" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.892718 4991 scope.go:117] "RemoveContainer" containerID="573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668" Dec 06 01:53:18 crc kubenswrapper[4991]: E1206 
01:53:18.893551 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668\": container with ID starting with 573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668 not found: ID does not exist" containerID="573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668" Dec 06 01:53:18 crc kubenswrapper[4991]: I1206 01:53:18.893743 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668"} err="failed to get container status \"573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668\": rpc error: code = NotFound desc = could not find container \"573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668\": container with ID starting with 573fe477e9b7d494a8add19d5b316b900de0fdc3de95efbc1af69d925ad5c668 not found: ID does not exist" Dec 06 01:53:19 crc kubenswrapper[4991]: I1206 01:53:19.062471 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" path="/var/lib/kubelet/pods/21e4af18-7750-4560-a70f-27e2ee289cfa/volumes" Dec 06 01:53:46 crc kubenswrapper[4991]: I1206 01:53:46.739501 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:53:46 crc kubenswrapper[4991]: I1206 01:53:46.740216 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 06 01:54:16 crc kubenswrapper[4991]: I1206 01:54:16.739558 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 01:54:16 crc kubenswrapper[4991]: I1206 01:54:16.741017 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 01:54:16 crc kubenswrapper[4991]: I1206 01:54:16.741142 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 01:54:16 crc kubenswrapper[4991]: I1206 01:54:16.741851 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 01:54:16 crc kubenswrapper[4991]: I1206 01:54:16.742015 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" gracePeriod=600 Dec 06 01:54:16 crc kubenswrapper[4991]: E1206 01:54:16.891924 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:54:17 crc kubenswrapper[4991]: I1206 01:54:17.409521 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" exitCode=0 Dec 06 01:54:17 crc kubenswrapper[4991]: I1206 01:54:17.409566 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0"} Dec 06 01:54:17 crc kubenswrapper[4991]: I1206 01:54:17.409772 4991 scope.go:117] "RemoveContainer" containerID="ff82cbcfd79f5972d1bcb99cecde7613433b7eb24ecbbd15610b87dc68e48070" Dec 06 01:54:17 crc kubenswrapper[4991]: I1206 01:54:17.410449 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:54:17 crc kubenswrapper[4991]: E1206 01:54:17.410885 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:54:32 crc kubenswrapper[4991]: I1206 01:54:32.038826 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:54:32 crc kubenswrapper[4991]: E1206 01:54:32.040098 4991 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.102273 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zmwph"] Dec 06 01:54:42 crc kubenswrapper[4991]: E1206 01:54:42.103478 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="extract-content" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.103494 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="extract-content" Dec 06 01:54:42 crc kubenswrapper[4991]: E1206 01:54:42.103521 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="extract-utilities" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.103528 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="extract-utilities" Dec 06 01:54:42 crc kubenswrapper[4991]: E1206 01:54:42.103558 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="registry-server" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.103565 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="registry-server" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.103748 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e4af18-7750-4560-a70f-27e2ee289cfa" containerName="registry-server" Dec 06 01:54:42 crc 
kubenswrapper[4991]: I1206 01:54:42.105441 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.124811 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmwph"] Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.302181 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z2pr\" (UniqueName: \"kubernetes.io/projected/f9613661-270b-40e1-a4ce-1fb6d120ddab-kube-api-access-8z2pr\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.302248 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-utilities\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.302494 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-catalog-content\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.404954 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2pr\" (UniqueName: \"kubernetes.io/projected/f9613661-270b-40e1-a4ce-1fb6d120ddab-kube-api-access-8z2pr\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " 
pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.405072 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-utilities\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.405145 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-catalog-content\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.405683 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-catalog-content\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.406817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-utilities\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.442322 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z2pr\" (UniqueName: \"kubernetes.io/projected/f9613661-270b-40e1-a4ce-1fb6d120ddab-kube-api-access-8z2pr\") pod \"community-operators-zmwph\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " 
pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:42 crc kubenswrapper[4991]: I1206 01:54:42.729812 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:43 crc kubenswrapper[4991]: I1206 01:54:43.242169 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmwph"] Dec 06 01:54:43 crc kubenswrapper[4991]: I1206 01:54:43.747854 4991 generic.go:334] "Generic (PLEG): container finished" podID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerID="2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600" exitCode=0 Dec 06 01:54:43 crc kubenswrapper[4991]: I1206 01:54:43.747912 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerDied","Data":"2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600"} Dec 06 01:54:43 crc kubenswrapper[4991]: I1206 01:54:43.747952 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerStarted","Data":"8d457df03fdd193ec927dcd73b03476136119f2e5f3859b92e63c921c50cdfa0"} Dec 06 01:54:44 crc kubenswrapper[4991]: I1206 01:54:44.764771 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerStarted","Data":"f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664"} Dec 06 01:54:45 crc kubenswrapper[4991]: I1206 01:54:45.778042 4991 generic.go:334] "Generic (PLEG): container finished" podID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerID="f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664" exitCode=0 Dec 06 01:54:45 crc kubenswrapper[4991]: I1206 01:54:45.778183 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerDied","Data":"f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664"} Dec 06 01:54:47 crc kubenswrapper[4991]: I1206 01:54:47.051547 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:54:47 crc kubenswrapper[4991]: E1206 01:54:47.052763 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:54:47 crc kubenswrapper[4991]: I1206 01:54:47.798824 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerStarted","Data":"dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355"} Dec 06 01:54:52 crc kubenswrapper[4991]: I1206 01:54:52.730264 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:52 crc kubenswrapper[4991]: I1206 01:54:52.730929 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:52 crc kubenswrapper[4991]: I1206 01:54:52.828576 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:52 crc kubenswrapper[4991]: I1206 01:54:52.865428 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zmwph" 
podStartSLOduration=8.057848045 podStartE2EDuration="10.865398847s" podCreationTimestamp="2025-12-06 01:54:42 +0000 UTC" firstStartedPulling="2025-12-06 01:54:43.753970083 +0000 UTC m=+3157.104260551" lastFinishedPulling="2025-12-06 01:54:46.561520885 +0000 UTC m=+3159.911811353" observedRunningTime="2025-12-06 01:54:47.826577631 +0000 UTC m=+3161.176868099" watchObservedRunningTime="2025-12-06 01:54:52.865398847 +0000 UTC m=+3166.215689325" Dec 06 01:54:53 crc kubenswrapper[4991]: I1206 01:54:53.152696 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:53 crc kubenswrapper[4991]: I1206 01:54:53.211491 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zmwph"] Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.120325 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zmwph" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="registry-server" containerID="cri-o://dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355" gracePeriod=2 Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.692189 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.791926 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z2pr\" (UniqueName: \"kubernetes.io/projected/f9613661-270b-40e1-a4ce-1fb6d120ddab-kube-api-access-8z2pr\") pod \"f9613661-270b-40e1-a4ce-1fb6d120ddab\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.792477 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-catalog-content\") pod \"f9613661-270b-40e1-a4ce-1fb6d120ddab\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.792614 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-utilities\") pod \"f9613661-270b-40e1-a4ce-1fb6d120ddab\" (UID: \"f9613661-270b-40e1-a4ce-1fb6d120ddab\") " Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.793726 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-utilities" (OuterVolumeSpecName: "utilities") pod "f9613661-270b-40e1-a4ce-1fb6d120ddab" (UID: "f9613661-270b-40e1-a4ce-1fb6d120ddab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.800756 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9613661-270b-40e1-a4ce-1fb6d120ddab-kube-api-access-8z2pr" (OuterVolumeSpecName: "kube-api-access-8z2pr") pod "f9613661-270b-40e1-a4ce-1fb6d120ddab" (UID: "f9613661-270b-40e1-a4ce-1fb6d120ddab"). InnerVolumeSpecName "kube-api-access-8z2pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.858658 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9613661-270b-40e1-a4ce-1fb6d120ddab" (UID: "f9613661-270b-40e1-a4ce-1fb6d120ddab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.894744 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z2pr\" (UniqueName: \"kubernetes.io/projected/f9613661-270b-40e1-a4ce-1fb6d120ddab-kube-api-access-8z2pr\") on node \"crc\" DevicePath \"\"" Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.894782 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:54:55 crc kubenswrapper[4991]: I1206 01:54:55.894791 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9613661-270b-40e1-a4ce-1fb6d120ddab-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.138806 4991 generic.go:334] "Generic (PLEG): container finished" podID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerID="dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355" exitCode=0 Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.138892 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerDied","Data":"dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355"} Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.138946 4991 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmwph" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.139052 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmwph" event={"ID":"f9613661-270b-40e1-a4ce-1fb6d120ddab","Type":"ContainerDied","Data":"8d457df03fdd193ec927dcd73b03476136119f2e5f3859b92e63c921c50cdfa0"} Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.139105 4991 scope.go:117] "RemoveContainer" containerID="dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.176745 4991 scope.go:117] "RemoveContainer" containerID="f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.208978 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zmwph"] Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.223461 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zmwph"] Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.232417 4991 scope.go:117] "RemoveContainer" containerID="2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.296496 4991 scope.go:117] "RemoveContainer" containerID="dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355" Dec 06 01:54:56 crc kubenswrapper[4991]: E1206 01:54:56.297706 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355\": container with ID starting with dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355 not found: ID does not exist" containerID="dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.297865 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355"} err="failed to get container status \"dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355\": rpc error: code = NotFound desc = could not find container \"dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355\": container with ID starting with dc162012b2ddb3696fbd2457bbce60af214894210bc5596489291e0ff30c5355 not found: ID does not exist" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.298046 4991 scope.go:117] "RemoveContainer" containerID="f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664" Dec 06 01:54:56 crc kubenswrapper[4991]: E1206 01:54:56.300194 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664\": container with ID starting with f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664 not found: ID does not exist" containerID="f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.300263 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664"} err="failed to get container status \"f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664\": rpc error: code = NotFound desc = could not find container \"f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664\": container with ID starting with f0206863395dd6b0970b4b41e73028cc56237158eb156954ac681b522cf77664 not found: ID does not exist" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.300308 4991 scope.go:117] "RemoveContainer" containerID="2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600" Dec 06 01:54:56 crc kubenswrapper[4991]: E1206 
01:54:56.300975 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600\": container with ID starting with 2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600 not found: ID does not exist" containerID="2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600" Dec 06 01:54:56 crc kubenswrapper[4991]: I1206 01:54:56.301168 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600"} err="failed to get container status \"2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600\": rpc error: code = NotFound desc = could not find container \"2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600\": container with ID starting with 2c0ac688b8c2a377bca5b84bf6d9889b06f0165c6bd796ab08c221e9a84d1600 not found: ID does not exist" Dec 06 01:54:57 crc kubenswrapper[4991]: I1206 01:54:57.084602 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" path="/var/lib/kubelet/pods/f9613661-270b-40e1-a4ce-1fb6d120ddab/volumes" Dec 06 01:55:02 crc kubenswrapper[4991]: I1206 01:55:02.038554 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:55:02 crc kubenswrapper[4991]: E1206 01:55:02.039510 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:55:13 crc kubenswrapper[4991]: I1206 01:55:13.038972 
4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:55:13 crc kubenswrapper[4991]: E1206 01:55:13.040474 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:55:28 crc kubenswrapper[4991]: I1206 01:55:28.038075 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:55:28 crc kubenswrapper[4991]: E1206 01:55:28.038966 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:55:43 crc kubenswrapper[4991]: I1206 01:55:43.038582 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:55:43 crc kubenswrapper[4991]: E1206 01:55:43.040601 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:55:56 crc kubenswrapper[4991]: I1206 
01:55:56.038248 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:55:56 crc kubenswrapper[4991]: E1206 01:55:56.039779 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:56:11 crc kubenswrapper[4991]: I1206 01:56:11.038452 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:56:11 crc kubenswrapper[4991]: E1206 01:56:11.039232 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:56:24 crc kubenswrapper[4991]: I1206 01:56:24.038213 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:56:24 crc kubenswrapper[4991]: E1206 01:56:24.039255 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:56:35 crc 
kubenswrapper[4991]: I1206 01:56:35.038014 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:56:35 crc kubenswrapper[4991]: E1206 01:56:35.038782 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:56:46 crc kubenswrapper[4991]: I1206 01:56:46.038513 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:56:46 crc kubenswrapper[4991]: E1206 01:56:46.039362 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:56:57 crc kubenswrapper[4991]: I1206 01:56:57.051507 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:56:57 crc kubenswrapper[4991]: E1206 01:56:57.052413 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 
06 01:57:11 crc kubenswrapper[4991]: I1206 01:57:11.038350 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:57:11 crc kubenswrapper[4991]: E1206 01:57:11.039397 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:57:24 crc kubenswrapper[4991]: I1206 01:57:24.038191 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:57:24 crc kubenswrapper[4991]: E1206 01:57:24.039474 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:57:39 crc kubenswrapper[4991]: I1206 01:57:39.041364 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:57:39 crc kubenswrapper[4991]: E1206 01:57:39.043401 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:57:50 crc kubenswrapper[4991]: I1206 01:57:50.038484 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:57:50 crc kubenswrapper[4991]: E1206 01:57:50.039628 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.008136 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlrx5"] Dec 06 01:57:54 crc kubenswrapper[4991]: E1206 01:57:54.018818 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="extract-content" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.018873 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="extract-content" Dec 06 01:57:54 crc kubenswrapper[4991]: E1206 01:57:54.018943 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="registry-server" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.018956 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="registry-server" Dec 06 01:57:54 crc kubenswrapper[4991]: E1206 01:57:54.019060 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="extract-utilities" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.019072 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="extract-utilities" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.019624 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9613661-270b-40e1-a4ce-1fb6d120ddab" containerName="registry-server" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.022587 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.039471 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlrx5"] Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.207775 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-utilities\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.207857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-catalog-content\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.208225 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flnf9\" (UniqueName: \"kubernetes.io/projected/34c80a90-2691-450a-8328-8ad6d64ed003-kube-api-access-flnf9\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.309828 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-flnf9\" (UniqueName: \"kubernetes.io/projected/34c80a90-2691-450a-8328-8ad6d64ed003-kube-api-access-flnf9\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.310030 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-utilities\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.310604 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-utilities\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.310750 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-catalog-content\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.311228 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-catalog-content\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.351214 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flnf9\" (UniqueName: 
\"kubernetes.io/projected/34c80a90-2691-450a-8328-8ad6d64ed003-kube-api-access-flnf9\") pod \"redhat-operators-wlrx5\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:54 crc kubenswrapper[4991]: I1206 01:57:54.650255 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:57:55 crc kubenswrapper[4991]: I1206 01:57:55.174938 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlrx5"] Dec 06 01:57:55 crc kubenswrapper[4991]: W1206 01:57:55.185115 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c80a90_2691_450a_8328_8ad6d64ed003.slice/crio-986afa9baa5ed7f7fde7fa7720c060a21a27774f97e1f8758e95eaa1d50e2408 WatchSource:0}: Error finding container 986afa9baa5ed7f7fde7fa7720c060a21a27774f97e1f8758e95eaa1d50e2408: Status 404 returned error can't find the container with id 986afa9baa5ed7f7fde7fa7720c060a21a27774f97e1f8758e95eaa1d50e2408 Dec 06 01:57:55 crc kubenswrapper[4991]: I1206 01:57:55.209695 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerStarted","Data":"986afa9baa5ed7f7fde7fa7720c060a21a27774f97e1f8758e95eaa1d50e2408"} Dec 06 01:57:56 crc kubenswrapper[4991]: I1206 01:57:56.225066 4991 generic.go:334] "Generic (PLEG): container finished" podID="34c80a90-2691-450a-8328-8ad6d64ed003" containerID="9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688" exitCode=0 Dec 06 01:57:56 crc kubenswrapper[4991]: I1206 01:57:56.225180 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" 
event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerDied","Data":"9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688"} Dec 06 01:57:56 crc kubenswrapper[4991]: I1206 01:57:56.229750 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 01:57:57 crc kubenswrapper[4991]: I1206 01:57:57.237625 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerStarted","Data":"fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33"} Dec 06 01:57:59 crc kubenswrapper[4991]: I1206 01:57:59.258870 4991 generic.go:334] "Generic (PLEG): container finished" podID="34c80a90-2691-450a-8328-8ad6d64ed003" containerID="fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33" exitCode=0 Dec 06 01:57:59 crc kubenswrapper[4991]: I1206 01:57:59.259003 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerDied","Data":"fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33"} Dec 06 01:58:00 crc kubenswrapper[4991]: I1206 01:58:00.274263 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerStarted","Data":"b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee"} Dec 06 01:58:00 crc kubenswrapper[4991]: I1206 01:58:00.305264 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlrx5" podStartSLOduration=3.8925729589999998 podStartE2EDuration="7.305244624s" podCreationTimestamp="2025-12-06 01:57:53 +0000 UTC" firstStartedPulling="2025-12-06 01:57:56.229139053 +0000 UTC m=+3349.579429551" lastFinishedPulling="2025-12-06 01:57:59.641810748 +0000 UTC 
m=+3352.992101216" observedRunningTime="2025-12-06 01:58:00.301451454 +0000 UTC m=+3353.651741942" watchObservedRunningTime="2025-12-06 01:58:00.305244624 +0000 UTC m=+3353.655535092" Dec 06 01:58:03 crc kubenswrapper[4991]: I1206 01:58:03.038069 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:58:03 crc kubenswrapper[4991]: E1206 01:58:03.038798 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:58:04 crc kubenswrapper[4991]: I1206 01:58:04.651327 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:58:04 crc kubenswrapper[4991]: I1206 01:58:04.651966 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:58:05 crc kubenswrapper[4991]: I1206 01:58:05.710497 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wlrx5" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="registry-server" probeResult="failure" output=< Dec 06 01:58:05 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Dec 06 01:58:05 crc kubenswrapper[4991]: > Dec 06 01:58:14 crc kubenswrapper[4991]: I1206 01:58:14.710109 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:58:14 crc kubenswrapper[4991]: I1206 01:58:14.769884 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:58:14 crc kubenswrapper[4991]: I1206 01:58:14.957754 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlrx5"] Dec 06 01:58:16 crc kubenswrapper[4991]: I1206 01:58:16.492846 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlrx5" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="registry-server" containerID="cri-o://b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee" gracePeriod=2 Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.025070 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.038605 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:58:17 crc kubenswrapper[4991]: E1206 01:58:17.038940 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.129959 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-catalog-content\") pod \"34c80a90-2691-450a-8328-8ad6d64ed003\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.130211 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-utilities\") pod \"34c80a90-2691-450a-8328-8ad6d64ed003\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.130332 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flnf9\" (UniqueName: \"kubernetes.io/projected/34c80a90-2691-450a-8328-8ad6d64ed003-kube-api-access-flnf9\") pod \"34c80a90-2691-450a-8328-8ad6d64ed003\" (UID: \"34c80a90-2691-450a-8328-8ad6d64ed003\") " Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.131226 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-utilities" (OuterVolumeSpecName: "utilities") pod "34c80a90-2691-450a-8328-8ad6d64ed003" (UID: "34c80a90-2691-450a-8328-8ad6d64ed003"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.131820 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.140052 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c80a90-2691-450a-8328-8ad6d64ed003-kube-api-access-flnf9" (OuterVolumeSpecName: "kube-api-access-flnf9") pod "34c80a90-2691-450a-8328-8ad6d64ed003" (UID: "34c80a90-2691-450a-8328-8ad6d64ed003"). InnerVolumeSpecName "kube-api-access-flnf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.233760 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flnf9\" (UniqueName: \"kubernetes.io/projected/34c80a90-2691-450a-8328-8ad6d64ed003-kube-api-access-flnf9\") on node \"crc\" DevicePath \"\"" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.241610 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34c80a90-2691-450a-8328-8ad6d64ed003" (UID: "34c80a90-2691-450a-8328-8ad6d64ed003"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.336081 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c80a90-2691-450a-8328-8ad6d64ed003-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.505270 4991 generic.go:334] "Generic (PLEG): container finished" podID="34c80a90-2691-450a-8328-8ad6d64ed003" containerID="b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee" exitCode=0 Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.505335 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerDied","Data":"b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee"} Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.505338 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlrx5" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.505363 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlrx5" event={"ID":"34c80a90-2691-450a-8328-8ad6d64ed003","Type":"ContainerDied","Data":"986afa9baa5ed7f7fde7fa7720c060a21a27774f97e1f8758e95eaa1d50e2408"} Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.505381 4991 scope.go:117] "RemoveContainer" containerID="b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.541236 4991 scope.go:117] "RemoveContainer" containerID="fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.552149 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlrx5"] Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.558885 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlrx5"] Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.569526 4991 scope.go:117] "RemoveContainer" containerID="9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.629194 4991 scope.go:117] "RemoveContainer" containerID="b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee" Dec 06 01:58:17 crc kubenswrapper[4991]: E1206 01:58:17.629712 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee\": container with ID starting with b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee not found: ID does not exist" containerID="b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.629848 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee"} err="failed to get container status \"b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee\": rpc error: code = NotFound desc = could not find container \"b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee\": container with ID starting with b48b39861edc8666f59f5766b2b77a841048ea18dc585bd7ba95d95b199ce4ee not found: ID does not exist" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.629911 4991 scope.go:117] "RemoveContainer" containerID="fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33" Dec 06 01:58:17 crc kubenswrapper[4991]: E1206 01:58:17.630292 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33\": container with ID starting with fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33 not found: ID does not exist" containerID="fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.630331 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33"} err="failed to get container status \"fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33\": rpc error: code = NotFound desc = could not find container \"fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33\": container with ID starting with fd83ea4d7e640046412cdc3146316bcf912208bad2bee329eceb5bec6b0eef33 not found: ID does not exist" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.630353 4991 scope.go:117] "RemoveContainer" containerID="9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688" Dec 06 01:58:17 crc kubenswrapper[4991]: E1206 
01:58:17.630623 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688\": container with ID starting with 9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688 not found: ID does not exist" containerID="9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688" Dec 06 01:58:17 crc kubenswrapper[4991]: I1206 01:58:17.630649 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688"} err="failed to get container status \"9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688\": rpc error: code = NotFound desc = could not find container \"9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688\": container with ID starting with 9f6d9aeb1dc0f2677fc3773236b98cb3c36946d796f1887d24d8cc6b46d50688 not found: ID does not exist" Dec 06 01:58:19 crc kubenswrapper[4991]: I1206 01:58:19.083761 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" path="/var/lib/kubelet/pods/34c80a90-2691-450a-8328-8ad6d64ed003/volumes" Dec 06 01:58:31 crc kubenswrapper[4991]: I1206 01:58:31.038755 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:58:31 crc kubenswrapper[4991]: E1206 01:58:31.040004 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:58:44 crc kubenswrapper[4991]: I1206 01:58:44.039338 
4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:58:44 crc kubenswrapper[4991]: E1206 01:58:44.040043 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:58:59 crc kubenswrapper[4991]: I1206 01:58:59.037292 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:58:59 crc kubenswrapper[4991]: E1206 01:58:59.038045 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:59:11 crc kubenswrapper[4991]: I1206 01:59:11.038421 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:59:11 crc kubenswrapper[4991]: E1206 01:59:11.039384 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 01:59:24 crc kubenswrapper[4991]: I1206 
01:59:24.037911 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 01:59:25 crc kubenswrapper[4991]: I1206 01:59:25.267544 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"569db4596366e9518cb632f3539d5e3f236a4f822ba96150dcc6d1c03fe9e86d"} Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.165320 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q"] Dec 06 02:00:00 crc kubenswrapper[4991]: E1206 02:00:00.166592 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="extract-content" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.166608 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="extract-content" Dec 06 02:00:00 crc kubenswrapper[4991]: E1206 02:00:00.166632 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="extract-utilities" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.166639 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="extract-utilities" Dec 06 02:00:00 crc kubenswrapper[4991]: E1206 02:00:00.166674 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="registry-server" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.166680 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="registry-server" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.166911 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="34c80a90-2691-450a-8328-8ad6d64ed003" containerName="registry-server" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.167797 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.172655 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.172906 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.203051 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q"] Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.234074 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znjsq\" (UniqueName: \"kubernetes.io/projected/72b29200-1c8b-4b1d-823b-5826737495b9-kube-api-access-znjsq\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.234137 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b29200-1c8b-4b1d-823b-5826737495b9-config-volume\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.234237 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/72b29200-1c8b-4b1d-823b-5826737495b9-secret-volume\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.336752 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znjsq\" (UniqueName: \"kubernetes.io/projected/72b29200-1c8b-4b1d-823b-5826737495b9-kube-api-access-znjsq\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.336826 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b29200-1c8b-4b1d-823b-5826737495b9-config-volume\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.336919 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72b29200-1c8b-4b1d-823b-5826737495b9-secret-volume\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.338506 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b29200-1c8b-4b1d-823b-5826737495b9-config-volume\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.345919 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72b29200-1c8b-4b1d-823b-5826737495b9-secret-volume\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.353937 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znjsq\" (UniqueName: \"kubernetes.io/projected/72b29200-1c8b-4b1d-823b-5826737495b9-kube-api-access-znjsq\") pod \"collect-profiles-29416440-xlr8q\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.501727 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:00 crc kubenswrapper[4991]: I1206 02:00:00.980602 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q"] Dec 06 02:00:01 crc kubenswrapper[4991]: W1206 02:00:00.999966 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72b29200_1c8b_4b1d_823b_5826737495b9.slice/crio-d4684449a05995eab2f607abd0000e9507a54f95e65515ee8a6d080c48eaae8c WatchSource:0}: Error finding container d4684449a05995eab2f607abd0000e9507a54f95e65515ee8a6d080c48eaae8c: Status 404 returned error can't find the container with id d4684449a05995eab2f607abd0000e9507a54f95e65515ee8a6d080c48eaae8c Dec 06 02:00:01 crc kubenswrapper[4991]: I1206 02:00:01.727172 4991 generic.go:334] "Generic (PLEG): container finished" podID="72b29200-1c8b-4b1d-823b-5826737495b9" containerID="49a5cf4a158298eb1c132c44e0e771105b4bf31760597f79a4b8425b4685438f" exitCode=0 Dec 06 02:00:01 crc 
kubenswrapper[4991]: I1206 02:00:01.727250 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" event={"ID":"72b29200-1c8b-4b1d-823b-5826737495b9","Type":"ContainerDied","Data":"49a5cf4a158298eb1c132c44e0e771105b4bf31760597f79a4b8425b4685438f"} Dec 06 02:00:01 crc kubenswrapper[4991]: I1206 02:00:01.727671 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" event={"ID":"72b29200-1c8b-4b1d-823b-5826737495b9","Type":"ContainerStarted","Data":"d4684449a05995eab2f607abd0000e9507a54f95e65515ee8a6d080c48eaae8c"} Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.147244 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.245687 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72b29200-1c8b-4b1d-823b-5826737495b9-secret-volume\") pod \"72b29200-1c8b-4b1d-823b-5826737495b9\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.245889 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znjsq\" (UniqueName: \"kubernetes.io/projected/72b29200-1c8b-4b1d-823b-5826737495b9-kube-api-access-znjsq\") pod \"72b29200-1c8b-4b1d-823b-5826737495b9\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.246090 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b29200-1c8b-4b1d-823b-5826737495b9-config-volume\") pod \"72b29200-1c8b-4b1d-823b-5826737495b9\" (UID: \"72b29200-1c8b-4b1d-823b-5826737495b9\") " Dec 06 02:00:03 crc kubenswrapper[4991]: 
I1206 02:00:03.246519 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b29200-1c8b-4b1d-823b-5826737495b9-config-volume" (OuterVolumeSpecName: "config-volume") pod "72b29200-1c8b-4b1d-823b-5826737495b9" (UID: "72b29200-1c8b-4b1d-823b-5826737495b9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.251902 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b29200-1c8b-4b1d-823b-5826737495b9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "72b29200-1c8b-4b1d-823b-5826737495b9" (UID: "72b29200-1c8b-4b1d-823b-5826737495b9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.251924 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b29200-1c8b-4b1d-823b-5826737495b9-kube-api-access-znjsq" (OuterVolumeSpecName: "kube-api-access-znjsq") pod "72b29200-1c8b-4b1d-823b-5826737495b9" (UID: "72b29200-1c8b-4b1d-823b-5826737495b9"). InnerVolumeSpecName "kube-api-access-znjsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.349201 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72b29200-1c8b-4b1d-823b-5826737495b9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.349255 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72b29200-1c8b-4b1d-823b-5826737495b9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.349274 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znjsq\" (UniqueName: \"kubernetes.io/projected/72b29200-1c8b-4b1d-823b-5826737495b9-kube-api-access-znjsq\") on node \"crc\" DevicePath \"\"" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.753365 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" event={"ID":"72b29200-1c8b-4b1d-823b-5826737495b9","Type":"ContainerDied","Data":"d4684449a05995eab2f607abd0000e9507a54f95e65515ee8a6d080c48eaae8c"} Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.753428 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4684449a05995eab2f607abd0000e9507a54f95e65515ee8a6d080c48eaae8c" Dec 06 02:00:03 crc kubenswrapper[4991]: I1206 02:00:03.753425 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416440-xlr8q" Dec 06 02:00:04 crc kubenswrapper[4991]: I1206 02:00:04.251372 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927"] Dec 06 02:00:04 crc kubenswrapper[4991]: I1206 02:00:04.267175 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416395-kn927"] Dec 06 02:00:05 crc kubenswrapper[4991]: I1206 02:00:05.060409 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4d64bf7-c39d-4ec3-86c6-e14d46d20388" path="/var/lib/kubelet/pods/e4d64bf7-c39d-4ec3-86c6-e14d46d20388/volumes" Dec 06 02:00:11 crc kubenswrapper[4991]: I1206 02:00:11.910565 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pbc9l"] Dec 06 02:00:11 crc kubenswrapper[4991]: E1206 02:00:11.912685 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b29200-1c8b-4b1d-823b-5826737495b9" containerName="collect-profiles" Dec 06 02:00:11 crc kubenswrapper[4991]: I1206 02:00:11.912770 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b29200-1c8b-4b1d-823b-5826737495b9" containerName="collect-profiles" Dec 06 02:00:11 crc kubenswrapper[4991]: I1206 02:00:11.913027 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b29200-1c8b-4b1d-823b-5826737495b9" containerName="collect-profiles" Dec 06 02:00:11 crc kubenswrapper[4991]: I1206 02:00:11.914702 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:11 crc kubenswrapper[4991]: I1206 02:00:11.929819 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pbc9l"] Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.076961 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sxwb\" (UniqueName: \"kubernetes.io/projected/2013c800-76e8-4c30-b489-bda82f0d6e77-kube-api-access-4sxwb\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.077075 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-utilities\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.077151 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-catalog-content\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.178810 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-catalog-content\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.179257 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4sxwb\" (UniqueName: \"kubernetes.io/projected/2013c800-76e8-4c30-b489-bda82f0d6e77-kube-api-access-4sxwb\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.179487 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-utilities\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.179687 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-catalog-content\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.179944 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-utilities\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.202449 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sxwb\" (UniqueName: \"kubernetes.io/projected/2013c800-76e8-4c30-b489-bda82f0d6e77-kube-api-access-4sxwb\") pod \"certified-operators-pbc9l\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.263065 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.772502 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pbc9l"] Dec 06 02:00:12 crc kubenswrapper[4991]: W1206 02:00:12.779552 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2013c800_76e8_4c30_b489_bda82f0d6e77.slice/crio-6462c4fd7eb3b7f513145e0afd8cc052b0814d0e12c37c6d8a74bf2c52489ab4 WatchSource:0}: Error finding container 6462c4fd7eb3b7f513145e0afd8cc052b0814d0e12c37c6d8a74bf2c52489ab4: Status 404 returned error can't find the container with id 6462c4fd7eb3b7f513145e0afd8cc052b0814d0e12c37c6d8a74bf2c52489ab4 Dec 06 02:00:12 crc kubenswrapper[4991]: I1206 02:00:12.869010 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerStarted","Data":"6462c4fd7eb3b7f513145e0afd8cc052b0814d0e12c37c6d8a74bf2c52489ab4"} Dec 06 02:00:13 crc kubenswrapper[4991]: I1206 02:00:13.883572 4991 generic.go:334] "Generic (PLEG): container finished" podID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerID="54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8" exitCode=0 Dec 06 02:00:13 crc kubenswrapper[4991]: I1206 02:00:13.883621 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerDied","Data":"54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8"} Dec 06 02:00:14 crc kubenswrapper[4991]: I1206 02:00:14.898226 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" 
event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerStarted","Data":"1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231"} Dec 06 02:00:15 crc kubenswrapper[4991]: I1206 02:00:15.909204 4991 generic.go:334] "Generic (PLEG): container finished" podID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerID="1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231" exitCode=0 Dec 06 02:00:15 crc kubenswrapper[4991]: I1206 02:00:15.909360 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerDied","Data":"1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231"} Dec 06 02:00:16 crc kubenswrapper[4991]: I1206 02:00:16.925829 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerStarted","Data":"e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e"} Dec 06 02:00:16 crc kubenswrapper[4991]: I1206 02:00:16.963864 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pbc9l" podStartSLOduration=3.433285932 podStartE2EDuration="5.963835095s" podCreationTimestamp="2025-12-06 02:00:11 +0000 UTC" firstStartedPulling="2025-12-06 02:00:13.886898824 +0000 UTC m=+3487.237189292" lastFinishedPulling="2025-12-06 02:00:16.417447987 +0000 UTC m=+3489.767738455" observedRunningTime="2025-12-06 02:00:16.950566824 +0000 UTC m=+3490.300857322" watchObservedRunningTime="2025-12-06 02:00:16.963835095 +0000 UTC m=+3490.314125573" Dec 06 02:00:18 crc kubenswrapper[4991]: I1206 02:00:18.415268 4991 scope.go:117] "RemoveContainer" containerID="5e8c4810473d427850db919966144089aee9a221e28ce2fb315f808076fb0205" Dec 06 02:00:22 crc kubenswrapper[4991]: I1206 02:00:22.263664 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:22 crc kubenswrapper[4991]: I1206 02:00:22.266702 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:22 crc kubenswrapper[4991]: I1206 02:00:22.311092 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:23 crc kubenswrapper[4991]: I1206 02:00:23.069831 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:23 crc kubenswrapper[4991]: I1206 02:00:23.132907 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pbc9l"] Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.015203 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pbc9l" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="registry-server" containerID="cri-o://e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e" gracePeriod=2 Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.502892 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.570400 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-utilities\") pod \"2013c800-76e8-4c30-b489-bda82f0d6e77\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.570579 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sxwb\" (UniqueName: \"kubernetes.io/projected/2013c800-76e8-4c30-b489-bda82f0d6e77-kube-api-access-4sxwb\") pod \"2013c800-76e8-4c30-b489-bda82f0d6e77\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.570711 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-catalog-content\") pod \"2013c800-76e8-4c30-b489-bda82f0d6e77\" (UID: \"2013c800-76e8-4c30-b489-bda82f0d6e77\") " Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.577684 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-utilities" (OuterVolumeSpecName: "utilities") pod "2013c800-76e8-4c30-b489-bda82f0d6e77" (UID: "2013c800-76e8-4c30-b489-bda82f0d6e77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.579911 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2013c800-76e8-4c30-b489-bda82f0d6e77-kube-api-access-4sxwb" (OuterVolumeSpecName: "kube-api-access-4sxwb") pod "2013c800-76e8-4c30-b489-bda82f0d6e77" (UID: "2013c800-76e8-4c30-b489-bda82f0d6e77"). InnerVolumeSpecName "kube-api-access-4sxwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.673397 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.673428 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sxwb\" (UniqueName: \"kubernetes.io/projected/2013c800-76e8-4c30-b489-bda82f0d6e77-kube-api-access-4sxwb\") on node \"crc\" DevicePath \"\"" Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.859528 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2013c800-76e8-4c30-b489-bda82f0d6e77" (UID: "2013c800-76e8-4c30-b489-bda82f0d6e77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:00:25 crc kubenswrapper[4991]: I1206 02:00:25.876712 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2013c800-76e8-4c30-b489-bda82f0d6e77-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.030296 4991 generic.go:334] "Generic (PLEG): container finished" podID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerID="e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e" exitCode=0 Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.030620 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pbc9l" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.030541 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerDied","Data":"e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e"} Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.031292 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pbc9l" event={"ID":"2013c800-76e8-4c30-b489-bda82f0d6e77","Type":"ContainerDied","Data":"6462c4fd7eb3b7f513145e0afd8cc052b0814d0e12c37c6d8a74bf2c52489ab4"} Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.031340 4991 scope.go:117] "RemoveContainer" containerID="e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.060103 4991 scope.go:117] "RemoveContainer" containerID="1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.087264 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pbc9l"] Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.096024 4991 scope.go:117] "RemoveContainer" containerID="54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.100495 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pbc9l"] Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.157925 4991 scope.go:117] "RemoveContainer" containerID="e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e" Dec 06 02:00:26 crc kubenswrapper[4991]: E1206 02:00:26.158486 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e\": container with ID starting with e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e not found: ID does not exist" containerID="e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.158546 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e"} err="failed to get container status \"e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e\": rpc error: code = NotFound desc = could not find container \"e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e\": container with ID starting with e7421a7b9a851ef76b779e87463ad59be067bdfcc1937332dc376a26fe95db6e not found: ID does not exist" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.158574 4991 scope.go:117] "RemoveContainer" containerID="1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231" Dec 06 02:00:26 crc kubenswrapper[4991]: E1206 02:00:26.159127 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231\": container with ID starting with 1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231 not found: ID does not exist" containerID="1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.159155 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231"} err="failed to get container status \"1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231\": rpc error: code = NotFound desc = could not find container \"1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231\": container with ID 
starting with 1c92dfe174dab2b88756f9015a2398ab7f80b45ca2fff159998a1e63b19cb231 not found: ID does not exist" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.159174 4991 scope.go:117] "RemoveContainer" containerID="54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8" Dec 06 02:00:26 crc kubenswrapper[4991]: E1206 02:00:26.159536 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8\": container with ID starting with 54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8 not found: ID does not exist" containerID="54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8" Dec 06 02:00:26 crc kubenswrapper[4991]: I1206 02:00:26.159621 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8"} err="failed to get container status \"54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8\": rpc error: code = NotFound desc = could not find container \"54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8\": container with ID starting with 54e72e09b5c236f08379c29d874ef1f9c9a6f60a2ceb0ef44cb4bf32db64dea8 not found: ID does not exist" Dec 06 02:00:27 crc kubenswrapper[4991]: I1206 02:00:27.080649 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" path="/var/lib/kubelet/pods/2013c800-76e8-4c30-b489-bda82f0d6e77/volumes" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.150293 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416441-t9lln"] Dec 06 02:01:00 crc kubenswrapper[4991]: E1206 02:01:00.151133 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="registry-server" Dec 06 02:01:00 crc 
kubenswrapper[4991]: I1206 02:01:00.151148 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="registry-server" Dec 06 02:01:00 crc kubenswrapper[4991]: E1206 02:01:00.151168 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="extract-content" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.151173 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="extract-content" Dec 06 02:01:00 crc kubenswrapper[4991]: E1206 02:01:00.151184 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="extract-utilities" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.151190 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="extract-utilities" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.151371 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2013c800-76e8-4c30-b489-bda82f0d6e77" containerName="registry-server" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.152033 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.172471 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416441-t9lln"] Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.300268 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-fernet-keys\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.300406 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-combined-ca-bundle\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.300833 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swhw9\" (UniqueName: \"kubernetes.io/projected/79a5c906-606c-4b08-bb99-fadf17a19cfa-kube-api-access-swhw9\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.300920 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-config-data\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.403281 4991 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-fernet-keys\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.403369 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-combined-ca-bundle\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.403439 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swhw9\" (UniqueName: \"kubernetes.io/projected/79a5c906-606c-4b08-bb99-fadf17a19cfa-kube-api-access-swhw9\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.403472 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-config-data\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.410421 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-combined-ca-bundle\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.415303 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-fernet-keys\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.417656 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-config-data\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.423524 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swhw9\" (UniqueName: \"kubernetes.io/projected/79a5c906-606c-4b08-bb99-fadf17a19cfa-kube-api-access-swhw9\") pod \"keystone-cron-29416441-t9lln\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:00 crc kubenswrapper[4991]: I1206 02:01:00.480294 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:01 crc kubenswrapper[4991]: I1206 02:01:01.002278 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416441-t9lln"] Dec 06 02:01:01 crc kubenswrapper[4991]: I1206 02:01:01.616933 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416441-t9lln" event={"ID":"79a5c906-606c-4b08-bb99-fadf17a19cfa","Type":"ContainerStarted","Data":"d94f4b6d4603b5a7867702932f2c4c9ebcc79b66d633a9615e94d8ed6a4d45b9"} Dec 06 02:01:01 crc kubenswrapper[4991]: I1206 02:01:01.617045 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416441-t9lln" event={"ID":"79a5c906-606c-4b08-bb99-fadf17a19cfa","Type":"ContainerStarted","Data":"de943b517822efe8b137bef310fc7ea7e18f25c5efef422e08a11268bff7bdb8"} Dec 06 02:01:01 crc kubenswrapper[4991]: I1206 02:01:01.633017 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416441-t9lln" podStartSLOduration=1.63297734 podStartE2EDuration="1.63297734s" podCreationTimestamp="2025-12-06 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 02:01:01.63260635 +0000 UTC m=+3534.982896858" watchObservedRunningTime="2025-12-06 02:01:01.63297734 +0000 UTC m=+3534.983267828" Dec 06 02:01:03 crc kubenswrapper[4991]: I1206 02:01:03.649189 4991 generic.go:334] "Generic (PLEG): container finished" podID="79a5c906-606c-4b08-bb99-fadf17a19cfa" containerID="d94f4b6d4603b5a7867702932f2c4c9ebcc79b66d633a9615e94d8ed6a4d45b9" exitCode=0 Dec 06 02:01:03 crc kubenswrapper[4991]: I1206 02:01:03.649271 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416441-t9lln" event={"ID":"79a5c906-606c-4b08-bb99-fadf17a19cfa","Type":"ContainerDied","Data":"d94f4b6d4603b5a7867702932f2c4c9ebcc79b66d633a9615e94d8ed6a4d45b9"} 
Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.155038 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.320946 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-combined-ca-bundle\") pod \"79a5c906-606c-4b08-bb99-fadf17a19cfa\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.321189 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-config-data\") pod \"79a5c906-606c-4b08-bb99-fadf17a19cfa\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.321242 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swhw9\" (UniqueName: \"kubernetes.io/projected/79a5c906-606c-4b08-bb99-fadf17a19cfa-kube-api-access-swhw9\") pod \"79a5c906-606c-4b08-bb99-fadf17a19cfa\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.321354 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-fernet-keys\") pod \"79a5c906-606c-4b08-bb99-fadf17a19cfa\" (UID: \"79a5c906-606c-4b08-bb99-fadf17a19cfa\") " Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.337790 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "79a5c906-606c-4b08-bb99-fadf17a19cfa" (UID: "79a5c906-606c-4b08-bb99-fadf17a19cfa"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.337947 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a5c906-606c-4b08-bb99-fadf17a19cfa-kube-api-access-swhw9" (OuterVolumeSpecName: "kube-api-access-swhw9") pod "79a5c906-606c-4b08-bb99-fadf17a19cfa" (UID: "79a5c906-606c-4b08-bb99-fadf17a19cfa"). InnerVolumeSpecName "kube-api-access-swhw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.362189 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79a5c906-606c-4b08-bb99-fadf17a19cfa" (UID: "79a5c906-606c-4b08-bb99-fadf17a19cfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.413292 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-config-data" (OuterVolumeSpecName: "config-data") pod "79a5c906-606c-4b08-bb99-fadf17a19cfa" (UID: "79a5c906-606c-4b08-bb99-fadf17a19cfa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.423739 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.423772 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.423782 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swhw9\" (UniqueName: \"kubernetes.io/projected/79a5c906-606c-4b08-bb99-fadf17a19cfa-kube-api-access-swhw9\") on node \"crc\" DevicePath \"\"" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.423793 4991 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79a5c906-606c-4b08-bb99-fadf17a19cfa-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.675333 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416441-t9lln" event={"ID":"79a5c906-606c-4b08-bb99-fadf17a19cfa","Type":"ContainerDied","Data":"de943b517822efe8b137bef310fc7ea7e18f25c5efef422e08a11268bff7bdb8"} Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.676178 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de943b517822efe8b137bef310fc7ea7e18f25c5efef422e08a11268bff7bdb8" Dec 06 02:01:05 crc kubenswrapper[4991]: I1206 02:01:05.675356 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416441-t9lln" Dec 06 02:01:46 crc kubenswrapper[4991]: I1206 02:01:46.739933 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:01:46 crc kubenswrapper[4991]: I1206 02:01:46.740604 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:02:16 crc kubenswrapper[4991]: I1206 02:02:16.739765 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:02:16 crc kubenswrapper[4991]: I1206 02:02:16.740263 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:02:27 crc kubenswrapper[4991]: I1206 02:02:27.623633 4991 generic.go:334] "Generic (PLEG): container finished" podID="67d48b5b-50a6-4515-9ca7-bc491239938a" containerID="d432e23e5c24ea0c1b9b7e136a07cffe9da78cd514ad1093342994d78c496aef" exitCode=0 Dec 06 02:02:27 crc kubenswrapper[4991]: I1206 02:02:27.623717 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"67d48b5b-50a6-4515-9ca7-bc491239938a","Type":"ContainerDied","Data":"d432e23e5c24ea0c1b9b7e136a07cffe9da78cd514ad1093342994d78c496aef"} Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.125790 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.281235 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgzsk\" (UniqueName: \"kubernetes.io/projected/67d48b5b-50a6-4515-9ca7-bc491239938a-kube-api-access-mgzsk\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.281696 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ca-certs\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.281727 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config-secret\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.281840 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-config-data\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.281868 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ssh-key\") 
pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.281957 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.282004 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-workdir\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.282045 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-temporary\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.282096 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config\") pod \"67d48b5b-50a6-4515-9ca7-bc491239938a\" (UID: \"67d48b5b-50a6-4515-9ca7-bc491239938a\") " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.283536 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-config-data" (OuterVolumeSpecName: "config-data") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.284403 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.288349 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d48b5b-50a6-4515-9ca7-bc491239938a-kube-api-access-mgzsk" (OuterVolumeSpecName: "kube-api-access-mgzsk") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "kube-api-access-mgzsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.291419 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.292912 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.316659 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.323663 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.353185 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.367727 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "67d48b5b-50a6-4515-9ca7-bc491239938a" (UID: "67d48b5b-50a6-4515-9ca7-bc491239938a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384656 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384697 4991 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384738 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384752 4991 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384767 4991 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/67d48b5b-50a6-4515-9ca7-bc491239938a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384779 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384792 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgzsk\" (UniqueName: \"kubernetes.io/projected/67d48b5b-50a6-4515-9ca7-bc491239938a-kube-api-access-mgzsk\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 
crc kubenswrapper[4991]: I1206 02:02:29.384805 4991 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.384819 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/67d48b5b-50a6-4515-9ca7-bc491239938a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.415636 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.487579 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.659589 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"67d48b5b-50a6-4515-9ca7-bc491239938a","Type":"ContainerDied","Data":"b66a1fdf75be0991435d200277b3e3946ebafc9eefbdbb7e4b91c643417f351f"} Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.659631 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b66a1fdf75be0991435d200277b3e3946ebafc9eefbdbb7e4b91c643417f351f" Dec 06 02:02:29 crc kubenswrapper[4991]: I1206 02:02:29.659634 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.701480 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 02:02:31 crc kubenswrapper[4991]: E1206 02:02:31.702602 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d48b5b-50a6-4515-9ca7-bc491239938a" containerName="tempest-tests-tempest-tests-runner" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.702620 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d48b5b-50a6-4515-9ca7-bc491239938a" containerName="tempest-tests-tempest-tests-runner" Dec 06 02:02:31 crc kubenswrapper[4991]: E1206 02:02:31.702657 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a5c906-606c-4b08-bb99-fadf17a19cfa" containerName="keystone-cron" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.702666 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a5c906-606c-4b08-bb99-fadf17a19cfa" containerName="keystone-cron" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.702885 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a5c906-606c-4b08-bb99-fadf17a19cfa" containerName="keystone-cron" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.702909 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d48b5b-50a6-4515-9ca7-bc491239938a" containerName="tempest-tests-tempest-tests-runner" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.703679 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.707825 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-759tp" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.721783 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.846452 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m848h\" (UniqueName: \"kubernetes.io/projected/ce84be0d-7832-4602-a7fb-977d635de15b-kube-api-access-m848h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.846674 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.948858 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m848h\" (UniqueName: \"kubernetes.io/projected/ce84be0d-7832-4602-a7fb-977d635de15b-kube-api-access-m848h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.949360 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.952967 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:31 crc kubenswrapper[4991]: I1206 02:02:31.971815 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m848h\" (UniqueName: \"kubernetes.io/projected/ce84be0d-7832-4602-a7fb-977d635de15b-kube-api-access-m848h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:32 crc kubenswrapper[4991]: I1206 02:02:32.007556 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce84be0d-7832-4602-a7fb-977d635de15b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:32 crc kubenswrapper[4991]: I1206 02:02:32.065563 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 02:02:32 crc kubenswrapper[4991]: I1206 02:02:32.602400 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 02:02:32 crc kubenswrapper[4991]: I1206 02:02:32.696358 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ce84be0d-7832-4602-a7fb-977d635de15b","Type":"ContainerStarted","Data":"edebd45b0f9e5316ae031d0df7a84f2d5a040ca62499ec25a94620e1d108663c"} Dec 06 02:02:34 crc kubenswrapper[4991]: I1206 02:02:34.720670 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ce84be0d-7832-4602-a7fb-977d635de15b","Type":"ContainerStarted","Data":"1e1fe0ad5f2f05196c6412d4a75ed500c3ad3ea10dc11ca97434de9ea11ed720"} Dec 06 02:02:34 crc kubenswrapper[4991]: I1206 02:02:34.743055 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.72989923 podStartE2EDuration="3.74303842s" podCreationTimestamp="2025-12-06 02:02:31 +0000 UTC" firstStartedPulling="2025-12-06 02:02:32.608103077 +0000 UTC m=+3625.958393545" lastFinishedPulling="2025-12-06 02:02:33.621242227 +0000 UTC m=+3626.971532735" observedRunningTime="2025-12-06 02:02:34.738753897 +0000 UTC m=+3628.089044415" watchObservedRunningTime="2025-12-06 02:02:34.74303842 +0000 UTC m=+3628.093328888" Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.739654 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:02:46 crc 
kubenswrapper[4991]: I1206 02:02:46.740531 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.740608 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.741745 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"569db4596366e9518cb632f3539d5e3f236a4f822ba96150dcc6d1c03fe9e86d"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.741857 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://569db4596366e9518cb632f3539d5e3f236a4f822ba96150dcc6d1c03fe9e86d" gracePeriod=600 Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.875237 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="569db4596366e9518cb632f3539d5e3f236a4f822ba96150dcc6d1c03fe9e86d" exitCode=0 Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.875369 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"569db4596366e9518cb632f3539d5e3f236a4f822ba96150dcc6d1c03fe9e86d"} 
Dec 06 02:02:46 crc kubenswrapper[4991]: I1206 02:02:46.875420 4991 scope.go:117] "RemoveContainer" containerID="a081368548455c25dff8453cd575b59dbccfd8323726b50732f5e6e2f20570a0" Dec 06 02:02:47 crc kubenswrapper[4991]: I1206 02:02:47.940232 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a"} Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.571365 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5l8jz/must-gather-wqhp2"] Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.574279 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.583208 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5l8jz"/"kube-root-ca.crt" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.583305 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5l8jz"/"openshift-service-ca.crt" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.600094 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5l8jz/must-gather-wqhp2"] Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.703807 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-must-gather-output\") pod \"must-gather-wqhp2\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.704005 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cbmbv\" (UniqueName: \"kubernetes.io/projected/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-kube-api-access-cbmbv\") pod \"must-gather-wqhp2\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.806406 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-must-gather-output\") pod \"must-gather-wqhp2\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.806474 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbmbv\" (UniqueName: \"kubernetes.io/projected/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-kube-api-access-cbmbv\") pod \"must-gather-wqhp2\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.807193 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-must-gather-output\") pod \"must-gather-wqhp2\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.826226 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbmbv\" (UniqueName: \"kubernetes.io/projected/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-kube-api-access-cbmbv\") pod \"must-gather-wqhp2\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:56 crc kubenswrapper[4991]: I1206 02:02:56.919946 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:02:57 crc kubenswrapper[4991]: I1206 02:02:57.407315 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 02:02:57 crc kubenswrapper[4991]: I1206 02:02:57.415832 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5l8jz/must-gather-wqhp2"] Dec 06 02:02:58 crc kubenswrapper[4991]: I1206 02:02:58.086494 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" event={"ID":"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f","Type":"ContainerStarted","Data":"8cdf2e5a801875985985f6a445bbe9cddfe20c74ce285728b6934b132b79c685"} Dec 06 02:03:02 crc kubenswrapper[4991]: I1206 02:03:02.126607 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" event={"ID":"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f","Type":"ContainerStarted","Data":"c8f4607ef8c4aafd717123a4f52f8e3c0ed4986df5fa6e2ad0146eed6f60831d"} Dec 06 02:03:03 crc kubenswrapper[4991]: I1206 02:03:03.142672 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" event={"ID":"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f","Type":"ContainerStarted","Data":"d79d6e9cea10f86c85722e1387ea68540194aeb5c964682c3f9f1290e37ac355"} Dec 06 02:03:03 crc kubenswrapper[4991]: I1206 02:03:03.178293 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" podStartSLOduration=2.853855116 podStartE2EDuration="7.178261223s" podCreationTimestamp="2025-12-06 02:02:56 +0000 UTC" firstStartedPulling="2025-12-06 02:02:57.407209393 +0000 UTC m=+3650.757499861" lastFinishedPulling="2025-12-06 02:03:01.7316155 +0000 UTC m=+3655.081905968" observedRunningTime="2025-12-06 02:03:03.16188723 +0000 UTC m=+3656.512177718" watchObservedRunningTime="2025-12-06 02:03:03.178261223 +0000 UTC 
m=+3656.528551691" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.114909 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-vmv7n"] Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.117060 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.119489 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5l8jz"/"default-dockercfg-49sfb" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.220732 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-host\") pod \"crc-debug-vmv7n\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.221271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6msn\" (UniqueName: \"kubernetes.io/projected/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-kube-api-access-j6msn\") pod \"crc-debug-vmv7n\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.324118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6msn\" (UniqueName: \"kubernetes.io/projected/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-kube-api-access-j6msn\") pod \"crc-debug-vmv7n\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.324908 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-host\") pod 
\"crc-debug-vmv7n\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.325040 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-host\") pod \"crc-debug-vmv7n\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.347246 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-54vqs"] Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.350436 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.367917 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-54vqs"] Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.368563 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6msn\" (UniqueName: \"kubernetes.io/projected/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-kube-api-access-j6msn\") pod \"crc-debug-vmv7n\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.440340 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:03:06 crc kubenswrapper[4991]: W1206 02:03:06.488762 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14efb60c_a1f6_426e_aa6a_be1ec8fbb2bc.slice/crio-3a2964e2e164b3790b5d172405678294ee41ef65688ec4292c0e9fa19a188bbf WatchSource:0}: Error finding container 3a2964e2e164b3790b5d172405678294ee41ef65688ec4292c0e9fa19a188bbf: Status 404 returned error can't find the container with id 3a2964e2e164b3790b5d172405678294ee41ef65688ec4292c0e9fa19a188bbf Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.531789 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-utilities\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.531896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-catalog-content\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.531935 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9k4\" (UniqueName: \"kubernetes.io/projected/30f32081-98b7-426c-8782-6df5b33657df-kube-api-access-6z9k4\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.634889 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-utilities\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.635349 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-catalog-content\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.635943 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-utilities\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.636106 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z9k4\" (UniqueName: \"kubernetes.io/projected/30f32081-98b7-426c-8782-6df5b33657df-kube-api-access-6z9k4\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.636061 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-catalog-content\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.657444 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9k4\" (UniqueName: 
\"kubernetes.io/projected/30f32081-98b7-426c-8782-6df5b33657df-kube-api-access-6z9k4\") pod \"redhat-marketplace-54vqs\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:06 crc kubenswrapper[4991]: I1206 02:03:06.719302 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:07 crc kubenswrapper[4991]: I1206 02:03:07.188000 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" event={"ID":"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc","Type":"ContainerStarted","Data":"3a2964e2e164b3790b5d172405678294ee41ef65688ec4292c0e9fa19a188bbf"} Dec 06 02:03:07 crc kubenswrapper[4991]: I1206 02:03:07.248384 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-54vqs"] Dec 06 02:03:07 crc kubenswrapper[4991]: W1206 02:03:07.258531 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f32081_98b7_426c_8782_6df5b33657df.slice/crio-edf089c7003056899c16ad0ca32c41a8100f3340cb8da4a37d9b81b6fc20c4e1 WatchSource:0}: Error finding container edf089c7003056899c16ad0ca32c41a8100f3340cb8da4a37d9b81b6fc20c4e1: Status 404 returned error can't find the container with id edf089c7003056899c16ad0ca32c41a8100f3340cb8da4a37d9b81b6fc20c4e1 Dec 06 02:03:08 crc kubenswrapper[4991]: I1206 02:03:08.199131 4991 generic.go:334] "Generic (PLEG): container finished" podID="30f32081-98b7-426c-8782-6df5b33657df" containerID="0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102" exitCode=0 Dec 06 02:03:08 crc kubenswrapper[4991]: I1206 02:03:08.199832 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" 
event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerDied","Data":"0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102"} Dec 06 02:03:08 crc kubenswrapper[4991]: I1206 02:03:08.199873 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerStarted","Data":"edf089c7003056899c16ad0ca32c41a8100f3340cb8da4a37d9b81b6fc20c4e1"} Dec 06 02:03:09 crc kubenswrapper[4991]: I1206 02:03:09.216071 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerStarted","Data":"2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226"} Dec 06 02:03:10 crc kubenswrapper[4991]: I1206 02:03:10.230652 4991 generic.go:334] "Generic (PLEG): container finished" podID="30f32081-98b7-426c-8782-6df5b33657df" containerID="2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226" exitCode=0 Dec 06 02:03:10 crc kubenswrapper[4991]: I1206 02:03:10.230700 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerDied","Data":"2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226"} Dec 06 02:03:11 crc kubenswrapper[4991]: I1206 02:03:11.244892 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerStarted","Data":"2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2"} Dec 06 02:03:11 crc kubenswrapper[4991]: I1206 02:03:11.266730 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-54vqs" podStartSLOduration=2.849739068 podStartE2EDuration="5.266709639s" podCreationTimestamp="2025-12-06 02:03:06 +0000 
UTC" firstStartedPulling="2025-12-06 02:03:08.202469013 +0000 UTC m=+3661.552759471" lastFinishedPulling="2025-12-06 02:03:10.619439574 +0000 UTC m=+3663.969730042" observedRunningTime="2025-12-06 02:03:11.262416845 +0000 UTC m=+3664.612707323" watchObservedRunningTime="2025-12-06 02:03:11.266709639 +0000 UTC m=+3664.617000107" Dec 06 02:03:16 crc kubenswrapper[4991]: I1206 02:03:16.720554 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:16 crc kubenswrapper[4991]: I1206 02:03:16.721135 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:16 crc kubenswrapper[4991]: I1206 02:03:16.776425 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:17 crc kubenswrapper[4991]: I1206 02:03:17.354553 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:17 crc kubenswrapper[4991]: I1206 02:03:17.413526 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-54vqs"] Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.325568 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-54vqs" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="registry-server" containerID="cri-o://2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2" gracePeriod=2 Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.326026 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" event={"ID":"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc","Type":"ContainerStarted","Data":"2fa78c91dfc8eab101b6be619d69c30b982d00ad83107eb37829902db663e980"} Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 
02:03:19.349116 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" podStartSLOduration=1.597555979 podStartE2EDuration="13.349093896s" podCreationTimestamp="2025-12-06 02:03:06 +0000 UTC" firstStartedPulling="2025-12-06 02:03:06.491717958 +0000 UTC m=+3659.842008426" lastFinishedPulling="2025-12-06 02:03:18.243255875 +0000 UTC m=+3671.593546343" observedRunningTime="2025-12-06 02:03:19.341836924 +0000 UTC m=+3672.692127392" watchObservedRunningTime="2025-12-06 02:03:19.349093896 +0000 UTC m=+3672.699384364" Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.871515 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.935225 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-utilities\") pod \"30f32081-98b7-426c-8782-6df5b33657df\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.935299 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z9k4\" (UniqueName: \"kubernetes.io/projected/30f32081-98b7-426c-8782-6df5b33657df-kube-api-access-6z9k4\") pod \"30f32081-98b7-426c-8782-6df5b33657df\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.935639 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-catalog-content\") pod \"30f32081-98b7-426c-8782-6df5b33657df\" (UID: \"30f32081-98b7-426c-8782-6df5b33657df\") " Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.936351 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-utilities" (OuterVolumeSpecName: "utilities") pod "30f32081-98b7-426c-8782-6df5b33657df" (UID: "30f32081-98b7-426c-8782-6df5b33657df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.948261 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f32081-98b7-426c-8782-6df5b33657df-kube-api-access-6z9k4" (OuterVolumeSpecName: "kube-api-access-6z9k4") pod "30f32081-98b7-426c-8782-6df5b33657df" (UID: "30f32081-98b7-426c-8782-6df5b33657df"). InnerVolumeSpecName "kube-api-access-6z9k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:03:19 crc kubenswrapper[4991]: I1206 02:03:19.967773 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30f32081-98b7-426c-8782-6df5b33657df" (UID: "30f32081-98b7-426c-8782-6df5b33657df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.038334 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.038624 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30f32081-98b7-426c-8782-6df5b33657df-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.038638 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z9k4\" (UniqueName: \"kubernetes.io/projected/30f32081-98b7-426c-8782-6df5b33657df-kube-api-access-6z9k4\") on node \"crc\" DevicePath \"\"" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.355305 4991 generic.go:334] "Generic (PLEG): container finished" podID="30f32081-98b7-426c-8782-6df5b33657df" containerID="2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2" exitCode=0 Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.355352 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerDied","Data":"2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2"} Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.355401 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54vqs" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.355428 4991 scope.go:117] "RemoveContainer" containerID="2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.355412 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54vqs" event={"ID":"30f32081-98b7-426c-8782-6df5b33657df","Type":"ContainerDied","Data":"edf089c7003056899c16ad0ca32c41a8100f3340cb8da4a37d9b81b6fc20c4e1"} Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.386939 4991 scope.go:117] "RemoveContainer" containerID="2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.391457 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-54vqs"] Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.405677 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-54vqs"] Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.414863 4991 scope.go:117] "RemoveContainer" containerID="0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.465108 4991 scope.go:117] "RemoveContainer" containerID="2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2" Dec 06 02:03:20 crc kubenswrapper[4991]: E1206 02:03:20.465704 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2\": container with ID starting with 2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2 not found: ID does not exist" containerID="2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.465748 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2"} err="failed to get container status \"2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2\": rpc error: code = NotFound desc = could not find container \"2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2\": container with ID starting with 2e9ce3dc8552e01c314759a34dc1a3c3bcd4968696cc2ae1a3e920ba752be5f2 not found: ID does not exist" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.465776 4991 scope.go:117] "RemoveContainer" containerID="2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226" Dec 06 02:03:20 crc kubenswrapper[4991]: E1206 02:03:20.466170 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226\": container with ID starting with 2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226 not found: ID does not exist" containerID="2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.466195 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226"} err="failed to get container status \"2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226\": rpc error: code = NotFound desc = could not find container \"2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226\": container with ID starting with 2178a3bf313fcb8f9068fa46f963b50fa0c558ccd18148d10702fcaecf930226 not found: ID does not exist" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.466211 4991 scope.go:117] "RemoveContainer" containerID="0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102" Dec 06 02:03:20 crc kubenswrapper[4991]: E1206 
02:03:20.466801 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102\": container with ID starting with 0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102 not found: ID does not exist" containerID="0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102" Dec 06 02:03:20 crc kubenswrapper[4991]: I1206 02:03:20.466849 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102"} err="failed to get container status \"0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102\": rpc error: code = NotFound desc = could not find container \"0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102\": container with ID starting with 0366119fe3051d3dd1b033d4ac45cc55fa4847dac02d71f0a511abc051b38102 not found: ID does not exist" Dec 06 02:03:21 crc kubenswrapper[4991]: I1206 02:03:21.051038 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f32081-98b7-426c-8782-6df5b33657df" path="/var/lib/kubelet/pods/30f32081-98b7-426c-8782-6df5b33657df/volumes" Dec 06 02:04:01 crc kubenswrapper[4991]: I1206 02:04:01.796782 4991 generic.go:334] "Generic (PLEG): container finished" podID="14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" containerID="2fa78c91dfc8eab101b6be619d69c30b982d00ad83107eb37829902db663e980" exitCode=0 Dec 06 02:04:01 crc kubenswrapper[4991]: I1206 02:04:01.796913 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" event={"ID":"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc","Type":"ContainerDied","Data":"2fa78c91dfc8eab101b6be619d69c30b982d00ad83107eb37829902db663e980"} Dec 06 02:04:02 crc kubenswrapper[4991]: I1206 02:04:02.979460 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.020490 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-vmv7n"] Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.036605 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-vmv7n"] Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.160763 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6msn\" (UniqueName: \"kubernetes.io/projected/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-kube-api-access-j6msn\") pod \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.160892 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-host\") pod \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\" (UID: \"14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc\") " Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.161065 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-host" (OuterVolumeSpecName: "host") pod "14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" (UID: "14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.162018 4991 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-host\") on node \"crc\" DevicePath \"\"" Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.171382 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-kube-api-access-j6msn" (OuterVolumeSpecName: "kube-api-access-j6msn") pod "14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" (UID: "14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc"). InnerVolumeSpecName "kube-api-access-j6msn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.264529 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6msn\" (UniqueName: \"kubernetes.io/projected/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc-kube-api-access-j6msn\") on node \"crc\" DevicePath \"\"" Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.831442 4991 scope.go:117] "RemoveContainer" containerID="2fa78c91dfc8eab101b6be619d69c30b982d00ad83107eb37829902db663e980" Dec 06 02:04:03 crc kubenswrapper[4991]: I1206 02:04:03.831957 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-vmv7n" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.273184 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-7fnrw"] Dec 06 02:04:04 crc kubenswrapper[4991]: E1206 02:04:04.274134 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" containerName="container-00" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.274157 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" containerName="container-00" Dec 06 02:04:04 crc kubenswrapper[4991]: E1206 02:04:04.274198 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="extract-utilities" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.274212 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="extract-utilities" Dec 06 02:04:04 crc kubenswrapper[4991]: E1206 02:04:04.274243 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="registry-server" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.274257 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="registry-server" Dec 06 02:04:04 crc kubenswrapper[4991]: E1206 02:04:04.274283 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="extract-content" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.274295 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="extract-content" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.274632 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" 
containerName="container-00" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.274668 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f32081-98b7-426c-8782-6df5b33657df" containerName="registry-server" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.275687 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.279412 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5l8jz"/"default-dockercfg-49sfb" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.389875 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6qw\" (UniqueName: \"kubernetes.io/projected/f79f3c7f-6909-48d2-8f7f-7f700860f57f-kube-api-access-hk6qw\") pod \"crc-debug-7fnrw\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.390539 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79f3c7f-6909-48d2-8f7f-7f700860f57f-host\") pod \"crc-debug-7fnrw\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.493551 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79f3c7f-6909-48d2-8f7f-7f700860f57f-host\") pod \"crc-debug-7fnrw\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.493779 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk6qw\" (UniqueName: 
\"kubernetes.io/projected/f79f3c7f-6909-48d2-8f7f-7f700860f57f-kube-api-access-hk6qw\") pod \"crc-debug-7fnrw\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.493809 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79f3c7f-6909-48d2-8f7f-7f700860f57f-host\") pod \"crc-debug-7fnrw\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.522710 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk6qw\" (UniqueName: \"kubernetes.io/projected/f79f3c7f-6909-48d2-8f7f-7f700860f57f-kube-api-access-hk6qw\") pod \"crc-debug-7fnrw\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.609834 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:04 crc kubenswrapper[4991]: I1206 02:04:04.864426 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" event={"ID":"f79f3c7f-6909-48d2-8f7f-7f700860f57f","Type":"ContainerStarted","Data":"a84f1c883acacc81a1f6f2cfb4b8acc1c9cad6b95b622746d900e711d2c7abea"} Dec 06 02:04:05 crc kubenswrapper[4991]: I1206 02:04:05.048720 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc" path="/var/lib/kubelet/pods/14efb60c-a1f6-426e-aa6a-be1ec8fbb2bc/volumes" Dec 06 02:04:05 crc kubenswrapper[4991]: I1206 02:04:05.876902 4991 generic.go:334] "Generic (PLEG): container finished" podID="f79f3c7f-6909-48d2-8f7f-7f700860f57f" containerID="8b7068500a9b856fe074f5cb27c16e6b8307546a692a251dc4664ade9325e9da" exitCode=0 Dec 06 02:04:05 crc kubenswrapper[4991]: I1206 02:04:05.877314 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" event={"ID":"f79f3c7f-6909-48d2-8f7f-7f700860f57f","Type":"ContainerDied","Data":"8b7068500a9b856fe074f5cb27c16e6b8307546a692a251dc4664ade9325e9da"} Dec 06 02:04:06 crc kubenswrapper[4991]: I1206 02:04:06.453039 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-7fnrw"] Dec 06 02:04:06 crc kubenswrapper[4991]: I1206 02:04:06.463642 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-7fnrw"] Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.020847 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.074468 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk6qw\" (UniqueName: \"kubernetes.io/projected/f79f3c7f-6909-48d2-8f7f-7f700860f57f-kube-api-access-hk6qw\") pod \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.074831 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79f3c7f-6909-48d2-8f7f-7f700860f57f-host\") pod \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\" (UID: \"f79f3c7f-6909-48d2-8f7f-7f700860f57f\") " Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.075033 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f79f3c7f-6909-48d2-8f7f-7f700860f57f-host" (OuterVolumeSpecName: "host") pod "f79f3c7f-6909-48d2-8f7f-7f700860f57f" (UID: "f79f3c7f-6909-48d2-8f7f-7f700860f57f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.075531 4991 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79f3c7f-6909-48d2-8f7f-7f700860f57f-host\") on node \"crc\" DevicePath \"\"" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.083518 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79f3c7f-6909-48d2-8f7f-7f700860f57f-kube-api-access-hk6qw" (OuterVolumeSpecName: "kube-api-access-hk6qw") pod "f79f3c7f-6909-48d2-8f7f-7f700860f57f" (UID: "f79f3c7f-6909-48d2-8f7f-7f700860f57f"). InnerVolumeSpecName "kube-api-access-hk6qw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.178968 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk6qw\" (UniqueName: \"kubernetes.io/projected/f79f3c7f-6909-48d2-8f7f-7f700860f57f-kube-api-access-hk6qw\") on node \"crc\" DevicePath \"\"" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.671702 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-r48rb"] Dec 06 02:04:07 crc kubenswrapper[4991]: E1206 02:04:07.672191 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79f3c7f-6909-48d2-8f7f-7f700860f57f" containerName="container-00" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.672209 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79f3c7f-6909-48d2-8f7f-7f700860f57f" containerName="container-00" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.672475 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79f3c7f-6909-48d2-8f7f-7f700860f57f" containerName="container-00" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.673190 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.686895 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tts4\" (UniqueName: \"kubernetes.io/projected/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-kube-api-access-8tts4\") pod \"crc-debug-r48rb\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.687062 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-host\") pod \"crc-debug-r48rb\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.788831 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-host\") pod \"crc-debug-r48rb\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.789043 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-host\") pod \"crc-debug-r48rb\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.789202 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tts4\" (UniqueName: \"kubernetes.io/projected/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-kube-api-access-8tts4\") pod \"crc-debug-r48rb\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc 
kubenswrapper[4991]: I1206 02:04:07.817789 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tts4\" (UniqueName: \"kubernetes.io/projected/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-kube-api-access-8tts4\") pod \"crc-debug-r48rb\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.906413 4991 scope.go:117] "RemoveContainer" containerID="8b7068500a9b856fe074f5cb27c16e6b8307546a692a251dc4664ade9325e9da" Dec 06 02:04:07 crc kubenswrapper[4991]: I1206 02:04:07.906449 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-7fnrw" Dec 06 02:04:08 crc kubenswrapper[4991]: I1206 02:04:08.003079 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:08 crc kubenswrapper[4991]: I1206 02:04:08.917869 4991 generic.go:334] "Generic (PLEG): container finished" podID="76b4e5ff-1599-461e-8fcb-812b6dfd1b59" containerID="9d6e2a50963bf57f128aa45cac8c8eafab099daad8d51ca1cf0584a4da179a56" exitCode=0 Dec 06 02:04:08 crc kubenswrapper[4991]: I1206 02:04:08.917966 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-r48rb" event={"ID":"76b4e5ff-1599-461e-8fcb-812b6dfd1b59","Type":"ContainerDied","Data":"9d6e2a50963bf57f128aa45cac8c8eafab099daad8d51ca1cf0584a4da179a56"} Dec 06 02:04:08 crc kubenswrapper[4991]: I1206 02:04:08.918264 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/crc-debug-r48rb" event={"ID":"76b4e5ff-1599-461e-8fcb-812b6dfd1b59","Type":"ContainerStarted","Data":"0582620e0773535dd6c7f092e2ef7e98911c4a3e88b060f0c5d3e1e8b4ec8548"} Dec 06 02:04:08 crc kubenswrapper[4991]: I1206 02:04:08.955820 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-r48rb"] Dec 06 
02:04:08 crc kubenswrapper[4991]: I1206 02:04:08.965841 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5l8jz/crc-debug-r48rb"] Dec 06 02:04:09 crc kubenswrapper[4991]: I1206 02:04:09.053207 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79f3c7f-6909-48d2-8f7f-7f700860f57f" path="/var/lib/kubelet/pods/f79f3c7f-6909-48d2-8f7f-7f700860f57f/volumes" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.054391 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.131279 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tts4\" (UniqueName: \"kubernetes.io/projected/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-kube-api-access-8tts4\") pod \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.131551 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-host\") pod \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\" (UID: \"76b4e5ff-1599-461e-8fcb-812b6dfd1b59\") " Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.131938 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-host" (OuterVolumeSpecName: "host") pod "76b4e5ff-1599-461e-8fcb-812b6dfd1b59" (UID: "76b4e5ff-1599-461e-8fcb-812b6dfd1b59"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.132751 4991 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-host\") on node \"crc\" DevicePath \"\"" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.139591 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-kube-api-access-8tts4" (OuterVolumeSpecName: "kube-api-access-8tts4") pod "76b4e5ff-1599-461e-8fcb-812b6dfd1b59" (UID: "76b4e5ff-1599-461e-8fcb-812b6dfd1b59"). InnerVolumeSpecName "kube-api-access-8tts4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.235209 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tts4\" (UniqueName: \"kubernetes.io/projected/76b4e5ff-1599-461e-8fcb-812b6dfd1b59-kube-api-access-8tts4\") on node \"crc\" DevicePath \"\"" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.941198 4991 scope.go:117] "RemoveContainer" containerID="9d6e2a50963bf57f128aa45cac8c8eafab099daad8d51ca1cf0584a4da179a56" Dec 06 02:04:10 crc kubenswrapper[4991]: I1206 02:04:10.941276 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/crc-debug-r48rb" Dec 06 02:04:11 crc kubenswrapper[4991]: I1206 02:04:11.052945 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b4e5ff-1599-461e-8fcb-812b6dfd1b59" path="/var/lib/kubelet/pods/76b4e5ff-1599-461e-8fcb-812b6dfd1b59/volumes" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.188904 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7f8d97fd-bwgzp_b53a8fe6-006f-49e8-9598-dd986c58e97a/barbican-api/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.314685 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7f8d97fd-bwgzp_b53a8fe6-006f-49e8-9598-dd986c58e97a/barbican-api-log/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.391814 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b4c56449d-5hvrn_79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1/barbican-keystone-listener/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.468981 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b4c56449d-5hvrn_79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1/barbican-keystone-listener-log/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.568715 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d69cddbc-dkwjg_cc3413e2-6d39-4013-94ac-fb8d1c42e5ff/barbican-worker-log/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.575584 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d69cddbc-dkwjg_cc3413e2-6d39-4013-94ac-fb8d1c42e5ff/barbican-worker/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.746906 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn_aa853f65-564a-473e-a155-b5ac6e57eedf/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.780186 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/ceilometer-central-agent/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.927463 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/ceilometer-notification-agent/0.log" Dec 06 02:04:24 crc kubenswrapper[4991]: I1206 02:04:24.950031 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/proxy-httpd/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.003647 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/sg-core/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.138810 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_612d471c-b8ae-46ac-b3ff-9920f5615bd4/cinder-api/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.149550 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_612d471c-b8ae-46ac-b3ff-9920f5615bd4/cinder-api-log/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.266831 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89779edf-f784-441f-9420-1e03444eea94/cinder-scheduler/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.326353 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89779edf-f784-441f-9420-1e03444eea94/probe/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.451260 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hdf44_62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.517952 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt_69c9b4b8-de83-4f43-8670-e9279daadb9b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.646762 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-69zsb_fa47d36c-149e-4778-b285-fa3b2e71277b/init/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.847166 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-69zsb_fa47d36c-149e-4778-b285-fa3b2e71277b/init/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.855706 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-69zsb_fa47d36c-149e-4778-b285-fa3b2e71277b/dnsmasq-dns/0.log" Dec 06 02:04:25 crc kubenswrapper[4991]: I1206 02:04:25.875324 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w_bc316311-f861-4cc3-bb9f-1a340e3ec877/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.083114 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b96bae9-f846-4e7e-b1df-3d51b139dcb5/glance-httpd/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.085928 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b96bae9-f846-4e7e-b1df-3d51b139dcb5/glance-log/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.302487 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_9569cf80-9b29-4a2e-8a9b-a618fb5b0128/glance-log/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.318875 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9569cf80-9b29-4a2e-8a9b-a618fb5b0128/glance-httpd/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.470511 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-577fd657dd-psmdm_15de5638-5e2b-4ce1-9d25-8f2830cd9241/horizon/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.704728 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2hls9_a64fe8ac-b9eb-4c7e-917b-56fb0da51872/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.871379 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gml2f_801ece65-0bd2-49cb-9c03-439c624bb145/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:26 crc kubenswrapper[4991]: I1206 02:04:26.901654 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-577fd657dd-psmdm_15de5638-5e2b-4ce1-9d25-8f2830cd9241/horizon-log/0.log" Dec 06 02:04:27 crc kubenswrapper[4991]: I1206 02:04:27.124359 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-99654f6-8nq4x_72ca5f78-0a0d-4e09-8975-cba79bab4842/keystone-api/0.log" Dec 06 02:04:27 crc kubenswrapper[4991]: I1206 02:04:27.149094 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416441-t9lln_79a5c906-606c-4b08-bb99-fadf17a19cfa/keystone-cron/0.log" Dec 06 02:04:27 crc kubenswrapper[4991]: I1206 02:04:27.410063 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rzttv_908dd8e5-0b1f-494f-8e10-5ee48d1faeb4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:27 crc kubenswrapper[4991]: I1206 02:04:27.430469 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_64844b68-4772-49f7-9328-8eed4c925267/kube-state-metrics/0.log" Dec 06 02:04:27 crc kubenswrapper[4991]: I1206 02:04:27.741910 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-779776dbb5-wnlcw_0027a35f-14c5-42ff-b153-359f129f0d2b/neutron-httpd/0.log" Dec 06 02:04:27 crc kubenswrapper[4991]: I1206 02:04:27.826681 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-779776dbb5-wnlcw_0027a35f-14c5-42ff-b153-359f129f0d2b/neutron-api/0.log" Dec 06 02:04:28 crc kubenswrapper[4991]: I1206 02:04:28.041137 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq_46509879-7d53-4e4c-b581-9c48e1b350ad/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:28 crc kubenswrapper[4991]: I1206 02:04:28.571713 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_380a9e11-2703-494e-aae0-ce7758d34509/nova-cell0-conductor-conductor/0.log" Dec 06 02:04:28 crc kubenswrapper[4991]: I1206 02:04:28.652792 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d718f384-f95f-4601-afc7-0aa3655b9855/nova-api-log/0.log" Dec 06 02:04:28 crc kubenswrapper[4991]: I1206 02:04:28.766412 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d718f384-f95f-4601-afc7-0aa3655b9855/nova-api-api/0.log" Dec 06 02:04:28 crc kubenswrapper[4991]: I1206 02:04:28.879596 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd/nova-cell1-conductor-conductor/0.log" 
Dec 06 02:04:28 crc kubenswrapper[4991]: I1206 02:04:28.989558 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c5be01c8-e833-4da5-a09e-da95ec913708/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.107366 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jntfk_3b3f0520-92f4-4e3e-aa3e-8950cdc437e9/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.383096 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bae01666-bf5e-4dc0-a540-440d511229e7/nova-metadata-log/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.556657 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a1581dfb-5721-4ba3-9364-4b891a476821/nova-scheduler-scheduler/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.616288 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e937b594-10b6-4742-a47a-7d21f0c19636/mysql-bootstrap/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.764015 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e937b594-10b6-4742-a47a-7d21f0c19636/mysql-bootstrap/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.826823 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e937b594-10b6-4742-a47a-7d21f0c19636/galera/0.log" Dec 06 02:04:29 crc kubenswrapper[4991]: I1206 02:04:29.968417 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1b6ac217-8e85-4279-b2d1-7dd35227f828/mysql-bootstrap/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.213303 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_1b6ac217-8e85-4279-b2d1-7dd35227f828/mysql-bootstrap/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.246463 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1b6ac217-8e85-4279-b2d1-7dd35227f828/galera/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.417794 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ff108074-7904-43a0-b621-6676980779f6/openstackclient/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.522383 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-82ddm_a3995a5a-dfc3-4769-8161-31f7e2234d13/ovn-controller/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.544063 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bae01666-bf5e-4dc0-a540-440d511229e7/nova-metadata-metadata/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.737636 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ltzgn_3e130223-fd73-41e3-8af7-d0637555c009/openstack-network-exporter/0.log" Dec 06 02:04:30 crc kubenswrapper[4991]: I1206 02:04:30.824300 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovsdb-server-init/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.075291 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovsdb-server-init/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.097706 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovs-vswitchd/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.152153 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovsdb-server/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.330695 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fcgx6_8ae62c44-69d1-4cbf-abb9-491082bf83e3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.397558 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_49a9417e-79cd-4b0c-a5b0-5528a9e20edc/ovn-northd/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.404239 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_49a9417e-79cd-4b0c-a5b0-5528a9e20edc/openstack-network-exporter/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.608425 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e154c690-d20d-463c-af7d-257a3ae60f3c/ovsdbserver-nb/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.662967 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e154c690-d20d-463c-af7d-257a3ae60f3c/openstack-network-exporter/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.865587 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c775c23-51a7-4213-9a70-1382e7c55a1e/openstack-network-exporter/0.log" Dec 06 02:04:31 crc kubenswrapper[4991]: I1206 02:04:31.887417 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c775c23-51a7-4213-9a70-1382e7c55a1e/ovsdbserver-sb/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.009508 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d59dffb86-qrfm9_cd5faf07-40cf-4cce-9c5a-0fb43ffbb993/placement-api/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.184209 4991 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6d59dffb86-qrfm9_cd5faf07-40cf-4cce-9c5a-0fb43ffbb993/placement-log/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.203002 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2e3e3138-a6f6-4005-b57d-ee02b9534e57/setup-container/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.412743 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2e3e3138-a6f6-4005-b57d-ee02b9534e57/rabbitmq/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.480916 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a/setup-container/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.499392 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2e3e3138-a6f6-4005-b57d-ee02b9534e57/setup-container/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.701279 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a/setup-container/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.781713 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g_8695023e-4250-4f03-960e-873fd9f9fba5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:32 crc kubenswrapper[4991]: I1206 02:04:32.806187 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a/rabbitmq/0.log" Dec 06 02:04:33 crc kubenswrapper[4991]: I1206 02:04:33.033039 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zvphr_eb12fac8-d6c4-4be8-907f-a06cae6bc067/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:33 crc 
kubenswrapper[4991]: I1206 02:04:33.069507 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l_128953a1-f318-43c1-a2bb-3b006c8600a4/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:33 crc kubenswrapper[4991]: I1206 02:04:33.516308 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-lwgtr_1c339885-2776-49cc-88b6-e79ff184d76a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:33 crc kubenswrapper[4991]: I1206 02:04:33.520560 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-74xt7_dfadc831-65e8-4c83-a3f6-1316db177fb5/ssh-known-hosts-edpm-deployment/0.log" Dec 06 02:04:33 crc kubenswrapper[4991]: I1206 02:04:33.774962 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-ff7f9b9c9-g4mpb_bbcad69c-20b6-46da-95b6-28a7d993ebab/proxy-server/0.log" Dec 06 02:04:33 crc kubenswrapper[4991]: I1206 02:04:33.856632 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-ff7f9b9c9-g4mpb_bbcad69c-20b6-46da-95b6-28a7d993ebab/proxy-httpd/0.log" Dec 06 02:04:33 crc kubenswrapper[4991]: I1206 02:04:33.924047 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mtqz6_955d4a82-f627-490d-bc6c-cb7420010230/swift-ring-rebalance/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.115648 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-reaper/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.145032 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-auditor/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.150749 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-replicator/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.278873 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-server/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.367105 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-replicator/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.372894 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-auditor/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.381157 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-server/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.486613 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-updater/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.587385 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-auditor/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.620319 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-expirer/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.655934 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-replicator/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.751249 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-server/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.795488 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-updater/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.880800 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/rsync/0.log" Dec 06 02:04:34 crc kubenswrapper[4991]: I1206 02:04:34.927256 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/swift-recon-cron/0.log" Dec 06 02:04:35 crc kubenswrapper[4991]: I1206 02:04:35.119655 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9_3067d2e3-f5b3-45c8-9e37-8a29a2dd153f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:35 crc kubenswrapper[4991]: I1206 02:04:35.205973 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_67d48b5b-50a6-4515-9ca7-bc491239938a/tempest-tests-tempest-tests-runner/0.log" Dec 06 02:04:35 crc kubenswrapper[4991]: I1206 02:04:35.388259 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ce84be0d-7832-4602-a7fb-977d635de15b/test-operator-logs-container/0.log" Dec 06 02:04:35 crc kubenswrapper[4991]: I1206 02:04:35.452296 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj_e4c4d67b-be1b-4502-a13d-96a119cbb436/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:04:45 crc kubenswrapper[4991]: I1206 02:04:45.273053 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_68a4be52-9eb9-4c1f-b625-de969a73a990/memcached/0.log" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.098762 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vk9c6"] Dec 06 02:04:48 crc kubenswrapper[4991]: E1206 02:04:48.099778 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b4e5ff-1599-461e-8fcb-812b6dfd1b59" containerName="container-00" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.099791 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b4e5ff-1599-461e-8fcb-812b6dfd1b59" containerName="container-00" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.099997 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b4e5ff-1599-461e-8fcb-812b6dfd1b59" containerName="container-00" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.101277 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.117078 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vk9c6"] Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.191394 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-utilities\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.191855 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-catalog-content\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " 
pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.191905 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7qt\" (UniqueName: \"kubernetes.io/projected/f11455ae-3bb8-4298-9764-6e39e1297827-kube-api-access-wj7qt\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.294613 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-utilities\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.294780 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-catalog-content\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.294853 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7qt\" (UniqueName: \"kubernetes.io/projected/f11455ae-3bb8-4298-9764-6e39e1297827-kube-api-access-wj7qt\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.295290 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-utilities\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " 
pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.295661 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-catalog-content\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.314628 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7qt\" (UniqueName: \"kubernetes.io/projected/f11455ae-3bb8-4298-9764-6e39e1297827-kube-api-access-wj7qt\") pod \"community-operators-vk9c6\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:48 crc kubenswrapper[4991]: I1206 02:04:48.428264 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:49 crc kubenswrapper[4991]: I1206 02:04:49.270740 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vk9c6"] Dec 06 02:04:49 crc kubenswrapper[4991]: I1206 02:04:49.381391 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerStarted","Data":"a29a002fef1873226ea54f56aa54adec93aaa09bcadad10adb78555eddd2408c"} Dec 06 02:04:50 crc kubenswrapper[4991]: I1206 02:04:50.396575 4991 generic.go:334] "Generic (PLEG): container finished" podID="f11455ae-3bb8-4298-9764-6e39e1297827" containerID="f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4" exitCode=0 Dec 06 02:04:50 crc kubenswrapper[4991]: I1206 02:04:50.396655 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" 
event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerDied","Data":"f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4"} Dec 06 02:04:51 crc kubenswrapper[4991]: I1206 02:04:51.408653 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerStarted","Data":"86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d"} Dec 06 02:04:52 crc kubenswrapper[4991]: I1206 02:04:52.424083 4991 generic.go:334] "Generic (PLEG): container finished" podID="f11455ae-3bb8-4298-9764-6e39e1297827" containerID="86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d" exitCode=0 Dec 06 02:04:52 crc kubenswrapper[4991]: I1206 02:04:52.424203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerDied","Data":"86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d"} Dec 06 02:04:53 crc kubenswrapper[4991]: I1206 02:04:53.463214 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerStarted","Data":"9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40"} Dec 06 02:04:53 crc kubenswrapper[4991]: I1206 02:04:53.491804 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vk9c6" podStartSLOduration=3.033494575 podStartE2EDuration="5.491783448s" podCreationTimestamp="2025-12-06 02:04:48 +0000 UTC" firstStartedPulling="2025-12-06 02:04:50.400687172 +0000 UTC m=+3763.750977650" lastFinishedPulling="2025-12-06 02:04:52.858976045 +0000 UTC m=+3766.209266523" observedRunningTime="2025-12-06 02:04:53.48353238 +0000 UTC m=+3766.833822868" watchObservedRunningTime="2025-12-06 02:04:53.491783448 +0000 UTC 
m=+3766.842073926" Dec 06 02:04:58 crc kubenswrapper[4991]: I1206 02:04:58.429230 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:58 crc kubenswrapper[4991]: I1206 02:04:58.430902 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:58 crc kubenswrapper[4991]: I1206 02:04:58.489295 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:58 crc kubenswrapper[4991]: I1206 02:04:58.564146 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:04:58 crc kubenswrapper[4991]: I1206 02:04:58.729478 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vk9c6"] Dec 06 02:05:00 crc kubenswrapper[4991]: I1206 02:05:00.536397 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vk9c6" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="registry-server" containerID="cri-o://9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40" gracePeriod=2 Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.013088 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.095492 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-catalog-content\") pod \"f11455ae-3bb8-4298-9764-6e39e1297827\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.095598 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj7qt\" (UniqueName: \"kubernetes.io/projected/f11455ae-3bb8-4298-9764-6e39e1297827-kube-api-access-wj7qt\") pod \"f11455ae-3bb8-4298-9764-6e39e1297827\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.095753 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-utilities\") pod \"f11455ae-3bb8-4298-9764-6e39e1297827\" (UID: \"f11455ae-3bb8-4298-9764-6e39e1297827\") " Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.097058 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-utilities" (OuterVolumeSpecName: "utilities") pod "f11455ae-3bb8-4298-9764-6e39e1297827" (UID: "f11455ae-3bb8-4298-9764-6e39e1297827"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.111481 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11455ae-3bb8-4298-9764-6e39e1297827-kube-api-access-wj7qt" (OuterVolumeSpecName: "kube-api-access-wj7qt") pod "f11455ae-3bb8-4298-9764-6e39e1297827" (UID: "f11455ae-3bb8-4298-9764-6e39e1297827"). InnerVolumeSpecName "kube-api-access-wj7qt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.156332 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f11455ae-3bb8-4298-9764-6e39e1297827" (UID: "f11455ae-3bb8-4298-9764-6e39e1297827"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.198471 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj7qt\" (UniqueName: \"kubernetes.io/projected/f11455ae-3bb8-4298-9764-6e39e1297827-kube-api-access-wj7qt\") on node \"crc\" DevicePath \"\"" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.198504 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.198514 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11455ae-3bb8-4298-9764-6e39e1297827-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.546741 4991 generic.go:334] "Generic (PLEG): container finished" podID="f11455ae-3bb8-4298-9764-6e39e1297827" containerID="9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40" exitCode=0 Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.546806 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerDied","Data":"9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40"} Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.546819 4991 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk9c6" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.546846 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk9c6" event={"ID":"f11455ae-3bb8-4298-9764-6e39e1297827","Type":"ContainerDied","Data":"a29a002fef1873226ea54f56aa54adec93aaa09bcadad10adb78555eddd2408c"} Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.546871 4991 scope.go:117] "RemoveContainer" containerID="9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.580993 4991 scope.go:117] "RemoveContainer" containerID="86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.590803 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vk9c6"] Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.601430 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vk9c6"] Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.602790 4991 scope.go:117] "RemoveContainer" containerID="f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.646101 4991 scope.go:117] "RemoveContainer" containerID="9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40" Dec 06 02:05:01 crc kubenswrapper[4991]: E1206 02:05:01.646497 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40\": container with ID starting with 9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40 not found: ID does not exist" containerID="9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.646533 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40"} err="failed to get container status \"9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40\": rpc error: code = NotFound desc = could not find container \"9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40\": container with ID starting with 9604831446f1ea1bcef79bec35c6a17a51d1739405ac4055951495b191142c40 not found: ID does not exist" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.646565 4991 scope.go:117] "RemoveContainer" containerID="86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d" Dec 06 02:05:01 crc kubenswrapper[4991]: E1206 02:05:01.647033 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d\": container with ID starting with 86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d not found: ID does not exist" containerID="86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.647069 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d"} err="failed to get container status \"86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d\": rpc error: code = NotFound desc = could not find container \"86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d\": container with ID starting with 86f584897fcd5bc39e325c36f3926b4c0bd0ace84da1291253acff14a20e654d not found: ID does not exist" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.647112 4991 scope.go:117] "RemoveContainer" containerID="f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4" Dec 06 02:05:01 crc kubenswrapper[4991]: E1206 
02:05:01.647406 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4\": container with ID starting with f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4 not found: ID does not exist" containerID="f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4" Dec 06 02:05:01 crc kubenswrapper[4991]: I1206 02:05:01.647433 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4"} err="failed to get container status \"f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4\": rpc error: code = NotFound desc = could not find container \"f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4\": container with ID starting with f7218e3fad994e43c0412edd4ebd2106a12bc51ac464d4ffc224d07ea5571de4 not found: ID does not exist" Dec 06 02:05:03 crc kubenswrapper[4991]: I1206 02:05:03.049523 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" path="/var/lib/kubelet/pods/f11455ae-3bb8-4298-9764-6e39e1297827/volumes" Dec 06 02:05:04 crc kubenswrapper[4991]: I1206 02:05:04.478491 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-crkd2_66cf73ac-d6ca-4a8c-855b-3d26bec5adb7/manager/0.log" Dec 06 02:05:04 crc kubenswrapper[4991]: I1206 02:05:04.489592 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-crkd2_66cf73ac-d6ca-4a8c-855b-3d26bec5adb7/kube-rbac-proxy/0.log" Dec 06 02:05:04 crc kubenswrapper[4991]: I1206 02:05:04.643866 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/util/0.log" Dec 06 02:05:04 crc kubenswrapper[4991]: I1206 02:05:04.837647 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/util/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.276224 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/pull/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.278837 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/pull/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.465261 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/util/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.502442 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/pull/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.508801 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/extract/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.690933 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-gbqt8_c99be400-34b2-431b-ba71-9c6c03f70c91/kube-rbac-proxy/0.log" Dec 06 02:05:05 crc 
kubenswrapper[4991]: I1206 02:05:05.720876 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-gbqt8_c99be400-34b2-431b-ba71-9c6c03f70c91/manager/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.759826 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-twvbc_7ceea028-825c-4235-b1bb-52adec27f46d/kube-rbac-proxy/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.874263 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-twvbc_7ceea028-825c-4235-b1bb-52adec27f46d/manager/0.log" Dec 06 02:05:05 crc kubenswrapper[4991]: I1206 02:05:05.980525 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hll9p_4275f29c-130a-4e02-aeb5-6410643706b8/kube-rbac-proxy/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.019975 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hll9p_4275f29c-130a-4e02-aeb5-6410643706b8/manager/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.171398 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5j6br_14a62a5b-5719-4192-b2d8-40dc2e000f41/kube-rbac-proxy/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.197530 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5j6br_14a62a5b-5719-4192-b2d8-40dc2e000f41/manager/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.297737 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-h9lxn_fd88f0ae-0cbd-4958-a05a-51e18d3ff32f/kube-rbac-proxy/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.380965 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-h9lxn_fd88f0ae-0cbd-4958-a05a-51e18d3ff32f/manager/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.442102 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-96kw2_37dd0775-3a19-4005-9cf0-7c038c293004/kube-rbac-proxy/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.653559 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-96kw2_37dd0775-3a19-4005-9cf0-7c038c293004/manager/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.687426 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ftl59_852a1018-8ddc-4a27-ba0c-8a1fcf0d010f/kube-rbac-proxy/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.742542 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ftl59_852a1018-8ddc-4a27-ba0c-8a1fcf0d010f/manager/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.854400 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-lwqvk_d21383b5-1b67-4b8f-a176-05769a28528d/kube-rbac-proxy/0.log" Dec 06 02:05:06 crc kubenswrapper[4991]: I1206 02:05:06.984067 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-lwqvk_d21383b5-1b67-4b8f-a176-05769a28528d/manager/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.081330 
4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-trx9l_7f3695d3-1d23-4779-90a6-d1b62ac3edc5/kube-rbac-proxy/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.125472 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-trx9l_7f3695d3-1d23-4779-90a6-d1b62ac3edc5/manager/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.196967 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4pvnj_4fafae92-8cd1-450f-9e3b-7abd1020ab75/kube-rbac-proxy/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.286280 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4pvnj_4fafae92-8cd1-450f-9e3b-7abd1020ab75/manager/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.398611 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rlhms_e5351a0d-ece3-441f-95f6-ccc4281e9462/kube-rbac-proxy/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.471053 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rlhms_e5351a0d-ece3-441f-95f6-ccc4281e9462/manager/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.571875 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xjgls_8c5d96f9-58a1-4ee7-82dd-f71237ef875c/kube-rbac-proxy/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.650566 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xjgls_8c5d96f9-58a1-4ee7-82dd-f71237ef875c/manager/0.log" Dec 06 02:05:07 crc 
kubenswrapper[4991]: I1206 02:05:07.699815 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4hx8k_91bdc603-1933-4f22-9ecd-d3e95f2f0924/kube-rbac-proxy/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.713542 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4hx8k_91bdc603-1933-4f22-9ecd-d3e95f2f0924/manager/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.844907 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr_1d549c0d-f4fd-4eeb-852b-78e302f65110/kube-rbac-proxy/0.log" Dec 06 02:05:07 crc kubenswrapper[4991]: I1206 02:05:07.885198 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr_1d549c0d-f4fd-4eeb-852b-78e302f65110/manager/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.244037 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qv7hh_8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75/registry-server/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.386883 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-9phkz_a4fd6e71-89f4-41d3-beed-a894e45b8fe8/kube-rbac-proxy/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.405541 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5f6c6584bc-6swc7_dd84f245-235a-406c-be63-88e9170e9902/operator/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.600863 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-9phkz_a4fd6e71-89f4-41d3-beed-a894e45b8fe8/manager/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.627373 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-72txt_725ee172-29cd-40c6-b0a1-9be795634566/kube-rbac-proxy/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.705123 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-72txt_725ee172-29cd-40c6-b0a1-9be795634566/manager/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.928102 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-z5dnf_5cfe648a-c326-46aa-83ed-31659feb31d0/operator/0.log" Dec 06 02:05:08 crc kubenswrapper[4991]: I1206 02:05:08.980081 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gkl7x_6b1fa1e2-0eb6-48bf-bebc-93db7d90a644/kube-rbac-proxy/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.152952 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d5d6748cd-5gjhx_a60f2feb-b041-4bb0-b718-9e669f3cf9c8/manager/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.156508 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-gdckl_54706fa6-a00c-4f66-bce5-660d5475ff7b/kube-rbac-proxy/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.170073 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gkl7x_6b1fa1e2-0eb6-48bf-bebc-93db7d90a644/manager/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.291953 4991 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-gdckl_54706fa6-a00c-4f66-bce5-660d5475ff7b/manager/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.316827 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5v8fh_7869a8de-b32f-4600-af69-49a42ec4d096/kube-rbac-proxy/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.373247 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5v8fh_7869a8de-b32f-4600-af69-49a42ec4d096/manager/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.467192 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r6xm6_35ecf5a7-ec56-4cfc-a96b-84dada809373/kube-rbac-proxy/0.log" Dec 06 02:05:09 crc kubenswrapper[4991]: I1206 02:05:09.502051 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r6xm6_35ecf5a7-ec56-4cfc-a96b-84dada809373/manager/0.log" Dec 06 02:05:16 crc kubenswrapper[4991]: I1206 02:05:16.739309 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:05:16 crc kubenswrapper[4991]: I1206 02:05:16.739922 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:05:28 crc kubenswrapper[4991]: I1206 
02:05:28.537805 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c9qfb_9a829457-575c-4b6b-bc3b-68c117691597/control-plane-machine-set-operator/0.log" Dec 06 02:05:28 crc kubenswrapper[4991]: I1206 02:05:28.723816 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hwqgl_6f04204c-29c6-4b62-b4f4-28935672a20e/kube-rbac-proxy/0.log" Dec 06 02:05:28 crc kubenswrapper[4991]: I1206 02:05:28.767783 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hwqgl_6f04204c-29c6-4b62-b4f4-28935672a20e/machine-api-operator/0.log" Dec 06 02:05:42 crc kubenswrapper[4991]: I1206 02:05:42.122662 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4w5qq_4c9fcb25-80e7-4c9a-b31d-68a4b1b82411/cert-manager-controller/0.log" Dec 06 02:05:42 crc kubenswrapper[4991]: I1206 02:05:42.216933 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wcfbf_8bcf50ab-be54-4a1a-95ca-f43426ebf55e/cert-manager-cainjector/0.log" Dec 06 02:05:42 crc kubenswrapper[4991]: I1206 02:05:42.315790 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2zqr8_217fa056-c208-4c13-af81-12e5e1c4d231/cert-manager-webhook/0.log" Dec 06 02:05:46 crc kubenswrapper[4991]: I1206 02:05:46.740412 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:05:46 crc kubenswrapper[4991]: I1206 02:05:46.741468 4991 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:05:55 crc kubenswrapper[4991]: I1206 02:05:55.494673 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-gm4xg_7b08a3d0-cce4-42df-b89a-a9b11ef39a0d/nmstate-console-plugin/0.log" Dec 06 02:05:55 crc kubenswrapper[4991]: I1206 02:05:55.738996 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6vvck_ac4eae73-f13b-4e52-9b25-f837bba897b0/nmstate-handler/0.log" Dec 06 02:05:55 crc kubenswrapper[4991]: I1206 02:05:55.741747 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-9czh7_8ba61c7d-dc05-4834-9e0e-e18bb755d1a9/kube-rbac-proxy/0.log" Dec 06 02:05:55 crc kubenswrapper[4991]: I1206 02:05:55.808895 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-9czh7_8ba61c7d-dc05-4834-9e0e-e18bb755d1a9/nmstate-metrics/0.log" Dec 06 02:05:55 crc kubenswrapper[4991]: I1206 02:05:55.903128 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-nl2cl_47befc6f-6b9a-4a39-a029-1d55f420cf3b/nmstate-operator/0.log" Dec 06 02:05:56 crc kubenswrapper[4991]: I1206 02:05:56.023228 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-q6ftt_461dd677-1efc-468e-833f-125c37a2c7b8/nmstate-webhook/0.log" Dec 06 02:06:10 crc kubenswrapper[4991]: I1206 02:06:10.976903 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qtv46_8110ff83-33b6-4f27-946e-383e8bff44c7/kube-rbac-proxy/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.125938 
4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qtv46_8110ff83-33b6-4f27-946e-383e8bff44c7/controller/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.243627 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.382790 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.420488 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.460642 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.469087 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.619024 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.671107 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.676618 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.692282 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.864110 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.882241 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.885795 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:06:11 crc kubenswrapper[4991]: I1206 02:06:11.903430 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/controller/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.109996 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/kube-rbac-proxy/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.123267 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/kube-rbac-proxy-frr/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.124059 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/frr-metrics/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.304047 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/reloader/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.374532 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-t5xxb_61ac5c00-6180-4a7b-a4c0-fd98da3bb456/frr-k8s-webhook-server/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.602848 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-545bb986f8-rh6kd_4fd49c2f-3ac6-4444-ab0a-4d9a47617377/manager/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.788882 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8495449b99-p7f45_2eba891b-78cc-4a1c-8f65-a09829f349e8/webhook-server/0.log" Dec 06 02:06:12 crc kubenswrapper[4991]: I1206 02:06:12.935040 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4hfhs_79797e2b-b768-414a-8f3f-0a68ae045269/kube-rbac-proxy/0.log" Dec 06 02:06:13 crc kubenswrapper[4991]: I1206 02:06:13.554126 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/frr/0.log" Dec 06 02:06:13 crc kubenswrapper[4991]: I1206 02:06:13.566522 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4hfhs_79797e2b-b768-414a-8f3f-0a68ae045269/speaker/0.log" Dec 06 02:06:16 crc kubenswrapper[4991]: I1206 02:06:16.739965 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:06:16 crc kubenswrapper[4991]: I1206 02:06:16.740369 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 06 02:06:16 crc kubenswrapper[4991]: I1206 02:06:16.740426 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 02:06:16 crc kubenswrapper[4991]: I1206 02:06:16.741343 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 02:06:16 crc kubenswrapper[4991]: I1206 02:06:16.741401 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" gracePeriod=600 Dec 06 02:06:16 crc kubenswrapper[4991]: E1206 02:06:16.876723 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:06:17 crc kubenswrapper[4991]: I1206 02:06:17.348569 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" exitCode=0 Dec 06 02:06:17 crc kubenswrapper[4991]: I1206 02:06:17.348616 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a"} Dec 06 02:06:17 crc kubenswrapper[4991]: I1206 02:06:17.348649 4991 scope.go:117] "RemoveContainer" containerID="569db4596366e9518cb632f3539d5e3f236a4f822ba96150dcc6d1c03fe9e86d" Dec 06 02:06:17 crc kubenswrapper[4991]: I1206 02:06:17.349390 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:06:17 crc kubenswrapper[4991]: E1206 02:06:17.349712 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.037578 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/util/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.254719 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/pull/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.276052 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/util/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.280041 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/pull/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.431898 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/extract/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.455872 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/pull/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.489452 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/util/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.609625 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/util/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.814014 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/pull/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.830322 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/util/0.log" Dec 06 02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.844593 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/pull/0.log" Dec 06 
02:06:27 crc kubenswrapper[4991]: I1206 02:06:27.996392 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/util/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.018815 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/pull/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.038079 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/extract/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.202764 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-utilities/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.388201 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-utilities/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.406665 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-content/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.421923 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-content/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.570900 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-utilities/0.log" Dec 06 02:06:28 crc 
kubenswrapper[4991]: I1206 02:06:28.576550 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-content/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.750237 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-utilities/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.961807 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-utilities/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.988701 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/registry-server/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.993927 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-content/0.log" Dec 06 02:06:28 crc kubenswrapper[4991]: I1206 02:06:28.999377 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-content/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.181912 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-utilities/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.199606 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-content/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.416404 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xs7vk_10a1dada-c415-427c-a5ed-36eac2391451/marketplace-operator/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.521672 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-utilities/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.721701 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-content/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.788612 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-utilities/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.835502 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-content/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.906165 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/registry-server/0.log" Dec 06 02:06:29 crc kubenswrapper[4991]: I1206 02:06:29.940543 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-utilities/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.027725 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-content/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.038538 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:06:30 crc 
kubenswrapper[4991]: E1206 02:06:30.038817 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.169103 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-utilities/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.182502 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/registry-server/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.385795 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-utilities/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.385928 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-content/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.420672 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-content/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.554935 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-content/0.log" Dec 06 02:06:30 crc kubenswrapper[4991]: I1206 02:06:30.591791 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-utilities/0.log" Dec 06 02:06:31 crc kubenswrapper[4991]: I1206 02:06:31.074887 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/registry-server/0.log" Dec 06 02:06:42 crc kubenswrapper[4991]: I1206 02:06:42.037941 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:06:42 crc kubenswrapper[4991]: E1206 02:06:42.039076 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:06:54 crc kubenswrapper[4991]: I1206 02:06:54.038227 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:06:54 crc kubenswrapper[4991]: E1206 02:06:54.038946 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:07:06 crc kubenswrapper[4991]: I1206 02:07:06.039297 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:07:06 crc kubenswrapper[4991]: E1206 02:07:06.040574 4991 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:07:18 crc kubenswrapper[4991]: I1206 02:07:18.040198 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:07:18 crc kubenswrapper[4991]: E1206 02:07:18.040895 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:07:30 crc kubenswrapper[4991]: I1206 02:07:30.038251 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:07:30 crc kubenswrapper[4991]: E1206 02:07:30.038960 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:07:41 crc kubenswrapper[4991]: I1206 02:07:41.038526 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:07:41 crc kubenswrapper[4991]: E1206 02:07:41.039519 4991 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:07:54 crc kubenswrapper[4991]: I1206 02:07:54.040247 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:07:54 crc kubenswrapper[4991]: E1206 02:07:54.042431 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:08:09 crc kubenswrapper[4991]: I1206 02:08:09.038491 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:08:09 crc kubenswrapper[4991]: E1206 02:08:09.039432 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.202306 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6s7gs"] Dec 06 02:08:18 crc kubenswrapper[4991]: E1206 02:08:18.203456 4991 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="registry-server" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.203474 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="registry-server" Dec 06 02:08:18 crc kubenswrapper[4991]: E1206 02:08:18.203501 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="extract-utilities" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.203510 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="extract-utilities" Dec 06 02:08:18 crc kubenswrapper[4991]: E1206 02:08:18.203537 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="extract-content" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.203545 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="extract-content" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.203782 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11455ae-3bb8-4298-9764-6e39e1297827" containerName="registry-server" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.205423 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.227350 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6s7gs"] Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.324072 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ltp\" (UniqueName: \"kubernetes.io/projected/a8579788-e9f8-4bb1-baf3-2315faa34123-kube-api-access-c6ltp\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.324205 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-catalog-content\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.324246 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-utilities\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.425963 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ltp\" (UniqueName: \"kubernetes.io/projected/a8579788-e9f8-4bb1-baf3-2315faa34123-kube-api-access-c6ltp\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.426053 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-catalog-content\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.426092 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-utilities\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.426522 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-catalog-content\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.426550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-utilities\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.449220 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ltp\" (UniqueName: \"kubernetes.io/projected/a8579788-e9f8-4bb1-baf3-2315faa34123-kube-api-access-c6ltp\") pod \"redhat-operators-6s7gs\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.548518 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.712230 4991 generic.go:334] "Generic (PLEG): container finished" podID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerID="c8f4607ef8c4aafd717123a4f52f8e3c0ed4986df5fa6e2ad0146eed6f60831d" exitCode=0 Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.712326 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" event={"ID":"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f","Type":"ContainerDied","Data":"c8f4607ef8c4aafd717123a4f52f8e3c0ed4986df5fa6e2ad0146eed6f60831d"} Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.713519 4991 scope.go:117] "RemoveContainer" containerID="c8f4607ef8c4aafd717123a4f52f8e3c0ed4986df5fa6e2ad0146eed6f60831d" Dec 06 02:08:18 crc kubenswrapper[4991]: I1206 02:08:18.964399 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5l8jz_must-gather-wqhp2_4bd7f7ba-4ca7-4a22-a400-10f272a42e5f/gather/0.log" Dec 06 02:08:19 crc kubenswrapper[4991]: I1206 02:08:19.058325 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6s7gs"] Dec 06 02:08:19 crc kubenswrapper[4991]: I1206 02:08:19.722239 4991 generic.go:334] "Generic (PLEG): container finished" podID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerID="2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8" exitCode=0 Dec 06 02:08:19 crc kubenswrapper[4991]: I1206 02:08:19.722288 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s7gs" event={"ID":"a8579788-e9f8-4bb1-baf3-2315faa34123","Type":"ContainerDied","Data":"2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8"} Dec 06 02:08:19 crc kubenswrapper[4991]: I1206 02:08:19.722316 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s7gs" 
event={"ID":"a8579788-e9f8-4bb1-baf3-2315faa34123","Type":"ContainerStarted","Data":"acc7423eea9edab35cff2585a6d741b0c59c0c2b4f329916694f3778efcc9909"} Dec 06 02:08:19 crc kubenswrapper[4991]: I1206 02:08:19.725038 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 02:08:21 crc kubenswrapper[4991]: I1206 02:08:21.745037 4991 generic.go:334] "Generic (PLEG): container finished" podID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerID="b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da" exitCode=0 Dec 06 02:08:21 crc kubenswrapper[4991]: I1206 02:08:21.745138 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s7gs" event={"ID":"a8579788-e9f8-4bb1-baf3-2315faa34123","Type":"ContainerDied","Data":"b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da"} Dec 06 02:08:22 crc kubenswrapper[4991]: I1206 02:08:22.039336 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:08:22 crc kubenswrapper[4991]: E1206 02:08:22.039828 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:08:22 crc kubenswrapper[4991]: I1206 02:08:22.765258 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s7gs" event={"ID":"a8579788-e9f8-4bb1-baf3-2315faa34123","Type":"ContainerStarted","Data":"15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367"} Dec 06 02:08:22 crc kubenswrapper[4991]: I1206 02:08:22.797550 4991 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-6s7gs" podStartSLOduration=2.341361606 podStartE2EDuration="4.79750841s" podCreationTimestamp="2025-12-06 02:08:18 +0000 UTC" firstStartedPulling="2025-12-06 02:08:19.72481284 +0000 UTC m=+3973.075103308" lastFinishedPulling="2025-12-06 02:08:22.180959644 +0000 UTC m=+3975.531250112" observedRunningTime="2025-12-06 02:08:22.790474514 +0000 UTC m=+3976.140765022" watchObservedRunningTime="2025-12-06 02:08:22.79750841 +0000 UTC m=+3976.147798878" Dec 06 02:08:27 crc kubenswrapper[4991]: I1206 02:08:27.321267 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5l8jz/must-gather-wqhp2"] Dec 06 02:08:27 crc kubenswrapper[4991]: I1206 02:08:27.322416 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="copy" containerID="cri-o://d79d6e9cea10f86c85722e1387ea68540194aeb5c964682c3f9f1290e37ac355" gracePeriod=2 Dec 06 02:08:27 crc kubenswrapper[4991]: I1206 02:08:27.332349 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5l8jz/must-gather-wqhp2"] Dec 06 02:08:27 crc kubenswrapper[4991]: I1206 02:08:27.827581 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5l8jz_must-gather-wqhp2_4bd7f7ba-4ca7-4a22-a400-10f272a42e5f/copy/0.log" Dec 06 02:08:27 crc kubenswrapper[4991]: I1206 02:08:27.828410 4991 generic.go:334] "Generic (PLEG): container finished" podID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerID="d79d6e9cea10f86c85722e1387ea68540194aeb5c964682c3f9f1290e37ac355" exitCode=143 Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.266868 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5l8jz_must-gather-wqhp2_4bd7f7ba-4ca7-4a22-a400-10f272a42e5f/copy/0.log" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.267665 4991 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.357175 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-must-gather-output\") pod \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.357380 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbmbv\" (UniqueName: \"kubernetes.io/projected/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-kube-api-access-cbmbv\") pod \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\" (UID: \"4bd7f7ba-4ca7-4a22-a400-10f272a42e5f\") " Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.364370 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-kube-api-access-cbmbv" (OuterVolumeSpecName: "kube-api-access-cbmbv") pod "4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" (UID: "4bd7f7ba-4ca7-4a22-a400-10f272a42e5f"). InnerVolumeSpecName "kube-api-access-cbmbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.459392 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbmbv\" (UniqueName: \"kubernetes.io/projected/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-kube-api-access-cbmbv\") on node \"crc\" DevicePath \"\"" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.501954 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" (UID: "4bd7f7ba-4ca7-4a22-a400-10f272a42e5f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.548941 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.549261 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.561245 4991 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.622737 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.859146 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5l8jz_must-gather-wqhp2_4bd7f7ba-4ca7-4a22-a400-10f272a42e5f/copy/0.log" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.859882 4991 scope.go:117] "RemoveContainer" containerID="d79d6e9cea10f86c85722e1387ea68540194aeb5c964682c3f9f1290e37ac355" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.859905 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5l8jz/must-gather-wqhp2" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.889312 4991 scope.go:117] "RemoveContainer" containerID="c8f4607ef8c4aafd717123a4f52f8e3c0ed4986df5fa6e2ad0146eed6f60831d" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.924175 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:28 crc kubenswrapper[4991]: I1206 02:08:28.991812 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6s7gs"] Dec 06 02:08:29 crc kubenswrapper[4991]: I1206 02:08:29.050635 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" path="/var/lib/kubelet/pods/4bd7f7ba-4ca7-4a22-a400-10f272a42e5f/volumes" Dec 06 02:08:30 crc kubenswrapper[4991]: I1206 02:08:30.884150 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6s7gs" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="registry-server" containerID="cri-o://15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367" gracePeriod=2 Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.462426 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.541952 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ltp\" (UniqueName: \"kubernetes.io/projected/a8579788-e9f8-4bb1-baf3-2315faa34123-kube-api-access-c6ltp\") pod \"a8579788-e9f8-4bb1-baf3-2315faa34123\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.542426 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-catalog-content\") pod \"a8579788-e9f8-4bb1-baf3-2315faa34123\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.542552 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-utilities\") pod \"a8579788-e9f8-4bb1-baf3-2315faa34123\" (UID: \"a8579788-e9f8-4bb1-baf3-2315faa34123\") " Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.543707 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-utilities" (OuterVolumeSpecName: "utilities") pod "a8579788-e9f8-4bb1-baf3-2315faa34123" (UID: "a8579788-e9f8-4bb1-baf3-2315faa34123"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.544193 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.548265 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8579788-e9f8-4bb1-baf3-2315faa34123-kube-api-access-c6ltp" (OuterVolumeSpecName: "kube-api-access-c6ltp") pod "a8579788-e9f8-4bb1-baf3-2315faa34123" (UID: "a8579788-e9f8-4bb1-baf3-2315faa34123"). InnerVolumeSpecName "kube-api-access-c6ltp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.646459 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ltp\" (UniqueName: \"kubernetes.io/projected/a8579788-e9f8-4bb1-baf3-2315faa34123-kube-api-access-c6ltp\") on node \"crc\" DevicePath \"\"" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.899172 4991 generic.go:334] "Generic (PLEG): container finished" podID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerID="15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367" exitCode=0 Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.899249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s7gs" event={"ID":"a8579788-e9f8-4bb1-baf3-2315faa34123","Type":"ContainerDied","Data":"15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367"} Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.899330 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6s7gs" event={"ID":"a8579788-e9f8-4bb1-baf3-2315faa34123","Type":"ContainerDied","Data":"acc7423eea9edab35cff2585a6d741b0c59c0c2b4f329916694f3778efcc9909"} Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 
02:08:31.899344 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6s7gs" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.899364 4991 scope.go:117] "RemoveContainer" containerID="15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.928433 4991 scope.go:117] "RemoveContainer" containerID="b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da" Dec 06 02:08:31 crc kubenswrapper[4991]: I1206 02:08:31.964254 4991 scope.go:117] "RemoveContainer" containerID="2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8" Dec 06 02:08:32 crc kubenswrapper[4991]: I1206 02:08:32.011869 4991 scope.go:117] "RemoveContainer" containerID="15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367" Dec 06 02:08:32 crc kubenswrapper[4991]: E1206 02:08:32.012515 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367\": container with ID starting with 15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367 not found: ID does not exist" containerID="15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367" Dec 06 02:08:32 crc kubenswrapper[4991]: I1206 02:08:32.012564 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367"} err="failed to get container status \"15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367\": rpc error: code = NotFound desc = could not find container \"15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367\": container with ID starting with 15f32d9cb80dcecdc4b129a8ce7f0bf861d8c04072e07ba16995188bfc33a367 not found: ID does not exist" Dec 06 02:08:32 crc kubenswrapper[4991]: I1206 02:08:32.012592 4991 
scope.go:117] "RemoveContainer" containerID="b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da" Dec 06 02:08:32 crc kubenswrapper[4991]: E1206 02:08:32.015943 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da\": container with ID starting with b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da not found: ID does not exist" containerID="b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da" Dec 06 02:08:32 crc kubenswrapper[4991]: I1206 02:08:32.016029 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da"} err="failed to get container status \"b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da\": rpc error: code = NotFound desc = could not find container \"b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da\": container with ID starting with b42423d6f2ec0ae3867d3f2152e440fc3300a9bcbb42fa80297b1866f0ed46da not found: ID does not exist" Dec 06 02:08:32 crc kubenswrapper[4991]: I1206 02:08:32.016070 4991 scope.go:117] "RemoveContainer" containerID="2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8" Dec 06 02:08:32 crc kubenswrapper[4991]: E1206 02:08:32.016558 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8\": container with ID starting with 2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8 not found: ID does not exist" containerID="2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8" Dec 06 02:08:32 crc kubenswrapper[4991]: I1206 02:08:32.016598 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8"} err="failed to get container status \"2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8\": rpc error: code = NotFound desc = could not find container \"2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8\": container with ID starting with 2322b61c8730f13b35e3a7679f451e7a290292ae69912effd02c96beb9866db8 not found: ID does not exist" Dec 06 02:08:33 crc kubenswrapper[4991]: I1206 02:08:33.039171 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:08:33 crc kubenswrapper[4991]: E1206 02:08:33.039920 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:08:33 crc kubenswrapper[4991]: I1206 02:08:33.232611 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8579788-e9f8-4bb1-baf3-2315faa34123" (UID: "a8579788-e9f8-4bb1-baf3-2315faa34123"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:08:33 crc kubenswrapper[4991]: I1206 02:08:33.280530 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8579788-e9f8-4bb1-baf3-2315faa34123-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:08:33 crc kubenswrapper[4991]: I1206 02:08:33.446918 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6s7gs"] Dec 06 02:08:33 crc kubenswrapper[4991]: I1206 02:08:33.458925 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6s7gs"] Dec 06 02:08:35 crc kubenswrapper[4991]: I1206 02:08:35.072194 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" path="/var/lib/kubelet/pods/a8579788-e9f8-4bb1-baf3-2315faa34123/volumes" Dec 06 02:08:44 crc kubenswrapper[4991]: I1206 02:08:44.039223 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:08:44 crc kubenswrapper[4991]: E1206 02:08:44.040415 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:08:56 crc kubenswrapper[4991]: I1206 02:08:56.039385 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:08:56 crc kubenswrapper[4991]: E1206 02:08:56.042165 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:09:08 crc kubenswrapper[4991]: I1206 02:09:08.037936 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:09:08 crc kubenswrapper[4991]: E1206 02:09:08.038803 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:09:23 crc kubenswrapper[4991]: I1206 02:09:23.039199 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:09:23 crc kubenswrapper[4991]: E1206 02:09:23.040147 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:09:36 crc kubenswrapper[4991]: I1206 02:09:36.038534 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:09:36 crc kubenswrapper[4991]: E1206 02:09:36.039671 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:09:48 crc kubenswrapper[4991]: I1206 02:09:48.038408 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:09:48 crc kubenswrapper[4991]: E1206 02:09:48.039432 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:10:00 crc kubenswrapper[4991]: I1206 02:10:00.038405 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:10:00 crc kubenswrapper[4991]: E1206 02:10:00.039362 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:10:11 crc kubenswrapper[4991]: I1206 02:10:11.039848 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:10:11 crc kubenswrapper[4991]: E1206 02:10:11.041260 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.211874 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-85d2n"] Dec 06 02:10:14 crc kubenswrapper[4991]: E1206 02:10:14.213025 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="gather" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213044 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="gather" Dec 06 02:10:14 crc kubenswrapper[4991]: E1206 02:10:14.213063 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="copy" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213071 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="copy" Dec 06 02:10:14 crc kubenswrapper[4991]: E1206 02:10:14.213086 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="registry-server" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213094 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="registry-server" Dec 06 02:10:14 crc kubenswrapper[4991]: E1206 02:10:14.213127 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="extract-content" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213135 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" 
containerName="extract-content" Dec 06 02:10:14 crc kubenswrapper[4991]: E1206 02:10:14.213158 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="extract-utilities" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213166 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="extract-utilities" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213401 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="gather" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213419 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8579788-e9f8-4bb1-baf3-2315faa34123" containerName="registry-server" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.213450 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd7f7ba-4ca7-4a22-a400-10f272a42e5f" containerName="copy" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.215243 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.225688 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85d2n"] Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.322065 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-catalog-content\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.322121 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkv9v\" (UniqueName: \"kubernetes.io/projected/6303302d-5d89-467e-825a-cfd8c2d9191e-kube-api-access-pkv9v\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.322146 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-utilities\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.424538 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-catalog-content\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.424632 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pkv9v\" (UniqueName: \"kubernetes.io/projected/6303302d-5d89-467e-825a-cfd8c2d9191e-kube-api-access-pkv9v\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.424668 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-utilities\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.425167 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-catalog-content\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.425190 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-utilities\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.447011 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkv9v\" (UniqueName: \"kubernetes.io/projected/6303302d-5d89-467e-825a-cfd8c2d9191e-kube-api-access-pkv9v\") pod \"certified-operators-85d2n\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:14 crc kubenswrapper[4991]: I1206 02:10:14.550625 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:15 crc kubenswrapper[4991]: I1206 02:10:15.081374 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85d2n"] Dec 06 02:10:15 crc kubenswrapper[4991]: I1206 02:10:15.551309 4991 generic.go:334] "Generic (PLEG): container finished" podID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerID="a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814" exitCode=0 Dec 06 02:10:15 crc kubenswrapper[4991]: I1206 02:10:15.551452 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85d2n" event={"ID":"6303302d-5d89-467e-825a-cfd8c2d9191e","Type":"ContainerDied","Data":"a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814"} Dec 06 02:10:15 crc kubenswrapper[4991]: I1206 02:10:15.551639 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85d2n" event={"ID":"6303302d-5d89-467e-825a-cfd8c2d9191e","Type":"ContainerStarted","Data":"05c589f847b5fd44eab3b082827f2e1bbaff251570dce2de4adb0c57cd6e900e"} Dec 06 02:10:17 crc kubenswrapper[4991]: I1206 02:10:17.584975 4991 generic.go:334] "Generic (PLEG): container finished" podID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerID="3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657" exitCode=0 Dec 06 02:10:17 crc kubenswrapper[4991]: I1206 02:10:17.585097 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85d2n" event={"ID":"6303302d-5d89-467e-825a-cfd8c2d9191e","Type":"ContainerDied","Data":"3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657"} Dec 06 02:10:18 crc kubenswrapper[4991]: I1206 02:10:18.597467 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85d2n" 
event={"ID":"6303302d-5d89-467e-825a-cfd8c2d9191e","Type":"ContainerStarted","Data":"0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d"} Dec 06 02:10:18 crc kubenswrapper[4991]: I1206 02:10:18.628935 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-85d2n" podStartSLOduration=2.168082102 podStartE2EDuration="4.628912348s" podCreationTimestamp="2025-12-06 02:10:14 +0000 UTC" firstStartedPulling="2025-12-06 02:10:15.555565491 +0000 UTC m=+4088.905855959" lastFinishedPulling="2025-12-06 02:10:18.016395737 +0000 UTC m=+4091.366686205" observedRunningTime="2025-12-06 02:10:18.618826501 +0000 UTC m=+4091.969116989" watchObservedRunningTime="2025-12-06 02:10:18.628912348 +0000 UTC m=+4091.979202826" Dec 06 02:10:24 crc kubenswrapper[4991]: I1206 02:10:24.039039 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:10:24 crc kubenswrapper[4991]: E1206 02:10:24.040082 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:10:24 crc kubenswrapper[4991]: I1206 02:10:24.551551 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:24 crc kubenswrapper[4991]: I1206 02:10:24.551954 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:24 crc kubenswrapper[4991]: I1206 02:10:24.630862 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:24 crc kubenswrapper[4991]: I1206 02:10:24.743241 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:24 crc kubenswrapper[4991]: I1206 02:10:24.883404 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85d2n"] Dec 06 02:10:26 crc kubenswrapper[4991]: I1206 02:10:26.683452 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-85d2n" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="registry-server" containerID="cri-o://0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d" gracePeriod=2 Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.238850 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.327537 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-utilities\") pod \"6303302d-5d89-467e-825a-cfd8c2d9191e\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.327748 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkv9v\" (UniqueName: \"kubernetes.io/projected/6303302d-5d89-467e-825a-cfd8c2d9191e-kube-api-access-pkv9v\") pod \"6303302d-5d89-467e-825a-cfd8c2d9191e\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.327803 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-catalog-content\") pod 
\"6303302d-5d89-467e-825a-cfd8c2d9191e\" (UID: \"6303302d-5d89-467e-825a-cfd8c2d9191e\") " Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.328339 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-utilities" (OuterVolumeSpecName: "utilities") pod "6303302d-5d89-467e-825a-cfd8c2d9191e" (UID: "6303302d-5d89-467e-825a-cfd8c2d9191e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.332897 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6303302d-5d89-467e-825a-cfd8c2d9191e-kube-api-access-pkv9v" (OuterVolumeSpecName: "kube-api-access-pkv9v") pod "6303302d-5d89-467e-825a-cfd8c2d9191e" (UID: "6303302d-5d89-467e-825a-cfd8c2d9191e"). InnerVolumeSpecName "kube-api-access-pkv9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.386614 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6303302d-5d89-467e-825a-cfd8c2d9191e" (UID: "6303302d-5d89-467e-825a-cfd8c2d9191e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.430662 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.430714 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkv9v\" (UniqueName: \"kubernetes.io/projected/6303302d-5d89-467e-825a-cfd8c2d9191e-kube-api-access-pkv9v\") on node \"crc\" DevicePath \"\"" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.430730 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6303302d-5d89-467e-825a-cfd8c2d9191e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.700452 4991 generic.go:334] "Generic (PLEG): container finished" podID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerID="0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d" exitCode=0 Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.700529 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85d2n" event={"ID":"6303302d-5d89-467e-825a-cfd8c2d9191e","Type":"ContainerDied","Data":"0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d"} Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.700585 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85d2n" event={"ID":"6303302d-5d89-467e-825a-cfd8c2d9191e","Type":"ContainerDied","Data":"05c589f847b5fd44eab3b082827f2e1bbaff251570dce2de4adb0c57cd6e900e"} Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.700622 4991 scope.go:117] "RemoveContainer" containerID="0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 
02:10:27.701490 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85d2n" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.727649 4991 scope.go:117] "RemoveContainer" containerID="3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.745276 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85d2n"] Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.754254 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-85d2n"] Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.780209 4991 scope.go:117] "RemoveContainer" containerID="a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.811393 4991 scope.go:117] "RemoveContainer" containerID="0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d" Dec 06 02:10:27 crc kubenswrapper[4991]: E1206 02:10:27.811838 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d\": container with ID starting with 0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d not found: ID does not exist" containerID="0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.811946 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d"} err="failed to get container status \"0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d\": rpc error: code = NotFound desc = could not find container \"0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d\": container with ID starting with 
0ac5ae7700096fbbe2550d1772852e9dc20614c5c79b1b1d0a7d35c32b7ba08d not found: ID does not exist" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.812052 4991 scope.go:117] "RemoveContainer" containerID="3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657" Dec 06 02:10:27 crc kubenswrapper[4991]: E1206 02:10:27.812399 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657\": container with ID starting with 3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657 not found: ID does not exist" containerID="3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.812479 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657"} err="failed to get container status \"3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657\": rpc error: code = NotFound desc = could not find container \"3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657\": container with ID starting with 3f4774688a2d4b5e519a3430f7578dfe256a8027f3e48922cf9a33b923ba2657 not found: ID does not exist" Dec 06 02:10:27 crc kubenswrapper[4991]: I1206 02:10:27.812543 4991 scope.go:117] "RemoveContainer" containerID="a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814" Dec 06 02:10:27 crc kubenswrapper[4991]: E1206 02:10:27.812885 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814\": container with ID starting with a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814 not found: ID does not exist" containerID="a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814" Dec 06 02:10:27 crc 
kubenswrapper[4991]: I1206 02:10:27.812978 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814"} err="failed to get container status \"a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814\": rpc error: code = NotFound desc = could not find container \"a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814\": container with ID starting with a0d8199a2748a4d06f98fe4b99c3b33b73a4c48b58475a19c343fd4871de1814 not found: ID does not exist" Dec 06 02:10:29 crc kubenswrapper[4991]: I1206 02:10:29.058756 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" path="/var/lib/kubelet/pods/6303302d-5d89-467e-825a-cfd8c2d9191e/volumes" Dec 06 02:10:38 crc kubenswrapper[4991]: I1206 02:10:38.038931 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:10:38 crc kubenswrapper[4991]: E1206 02:10:38.040402 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:10:49 crc kubenswrapper[4991]: I1206 02:10:49.038977 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:10:49 crc kubenswrapper[4991]: E1206 02:10:49.041941 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:11:00 crc kubenswrapper[4991]: I1206 02:11:00.046392 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:11:00 crc kubenswrapper[4991]: E1206 02:11:00.047178 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:11:14 crc kubenswrapper[4991]: I1206 02:11:14.038203 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:11:14 crc kubenswrapper[4991]: E1206 02:11:14.039239 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.003939 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2nnk/must-gather-2fsl5"] Dec 06 02:11:27 crc kubenswrapper[4991]: E1206 02:11:27.004841 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="registry-server" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 
02:11:27.004853 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="registry-server" Dec 06 02:11:27 crc kubenswrapper[4991]: E1206 02:11:27.004867 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="extract-content" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.004874 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="extract-content" Dec 06 02:11:27 crc kubenswrapper[4991]: E1206 02:11:27.004892 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="extract-utilities" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.004899 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="extract-utilities" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.005103 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6303302d-5d89-467e-825a-cfd8c2d9191e" containerName="registry-server" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.006114 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.008352 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n2nnk"/"default-dockercfg-wvtk6" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.008363 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n2nnk"/"openshift-service-ca.crt" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.009277 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n2nnk"/"kube-root-ca.crt" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.026375 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n2nnk/must-gather-2fsl5"] Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.040887 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.098262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcnh\" (UniqueName: \"kubernetes.io/projected/79cf9126-93c7-4477-a7ad-fd310b716db4-kube-api-access-6mcnh\") pod \"must-gather-2fsl5\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.099506 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79cf9126-93c7-4477-a7ad-fd310b716db4-must-gather-output\") pod \"must-gather-2fsl5\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.200742 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcnh\" (UniqueName: 
\"kubernetes.io/projected/79cf9126-93c7-4477-a7ad-fd310b716db4-kube-api-access-6mcnh\") pod \"must-gather-2fsl5\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.201086 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79cf9126-93c7-4477-a7ad-fd310b716db4-must-gather-output\") pod \"must-gather-2fsl5\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.201638 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79cf9126-93c7-4477-a7ad-fd310b716db4-must-gather-output\") pod \"must-gather-2fsl5\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.241194 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcnh\" (UniqueName: \"kubernetes.io/projected/79cf9126-93c7-4477-a7ad-fd310b716db4-kube-api-access-6mcnh\") pod \"must-gather-2fsl5\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.325382 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.455342 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"acdbee5b0dca7edd2b37629b57df167a9453fec9ce78915f11caf35e1a51bfe0"} Dec 06 02:11:27 crc kubenswrapper[4991]: I1206 02:11:27.926143 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n2nnk/must-gather-2fsl5"] Dec 06 02:11:28 crc kubenswrapper[4991]: I1206 02:11:28.465379 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" event={"ID":"79cf9126-93c7-4477-a7ad-fd310b716db4","Type":"ContainerStarted","Data":"f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae"} Dec 06 02:11:28 crc kubenswrapper[4991]: I1206 02:11:28.466173 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" event={"ID":"79cf9126-93c7-4477-a7ad-fd310b716db4","Type":"ContainerStarted","Data":"069679d256cef451a0e66142b17c5133903ede73900306db0c9182098f1be591"} Dec 06 02:11:29 crc kubenswrapper[4991]: I1206 02:11:29.476840 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" event={"ID":"79cf9126-93c7-4477-a7ad-fd310b716db4","Type":"ContainerStarted","Data":"218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0"} Dec 06 02:11:29 crc kubenswrapper[4991]: I1206 02:11:29.516253 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" podStartSLOduration=3.516228469 podStartE2EDuration="3.516228469s" podCreationTimestamp="2025-12-06 02:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 
02:11:29.500474771 +0000 UTC m=+4162.850765239" watchObservedRunningTime="2025-12-06 02:11:29.516228469 +0000 UTC m=+4162.866518957" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.546790 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-6rq4x"] Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.549194 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.618739 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3c72d5-a138-467e-8d50-f43f798ee882-host\") pod \"crc-debug-6rq4x\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.619186 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mqw\" (UniqueName: \"kubernetes.io/projected/1c3c72d5-a138-467e-8d50-f43f798ee882-kube-api-access-t7mqw\") pod \"crc-debug-6rq4x\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.720863 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mqw\" (UniqueName: \"kubernetes.io/projected/1c3c72d5-a138-467e-8d50-f43f798ee882-kube-api-access-t7mqw\") pod \"crc-debug-6rq4x\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.721375 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3c72d5-a138-467e-8d50-f43f798ee882-host\") pod \"crc-debug-6rq4x\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " 
pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.721524 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3c72d5-a138-467e-8d50-f43f798ee882-host\") pod \"crc-debug-6rq4x\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.745298 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mqw\" (UniqueName: \"kubernetes.io/projected/1c3c72d5-a138-467e-8d50-f43f798ee882-kube-api-access-t7mqw\") pod \"crc-debug-6rq4x\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:32 crc kubenswrapper[4991]: I1206 02:11:32.870994 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:11:33 crc kubenswrapper[4991]: I1206 02:11:33.517553 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" event={"ID":"1c3c72d5-a138-467e-8d50-f43f798ee882","Type":"ContainerStarted","Data":"7bb8fa2528aad2fea8e781407c65ef5d74386220828229537196b321a35c5459"} Dec 06 02:11:33 crc kubenswrapper[4991]: I1206 02:11:33.518104 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" event={"ID":"1c3c72d5-a138-467e-8d50-f43f798ee882","Type":"ContainerStarted","Data":"4bc2705ef300b0206a51d4ee78349060de99071862259a887f9ad711b4eaf334"} Dec 06 02:11:33 crc kubenswrapper[4991]: I1206 02:11:33.542585 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" podStartSLOduration=1.542560248 podStartE2EDuration="1.542560248s" podCreationTimestamp="2025-12-06 02:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 02:11:33.542558898 +0000 UTC m=+4166.892849386" watchObservedRunningTime="2025-12-06 02:11:33.542560248 +0000 UTC m=+4166.892850716" Dec 06 02:12:06 crc kubenswrapper[4991]: I1206 02:12:06.828362 4991 generic.go:334] "Generic (PLEG): container finished" podID="1c3c72d5-a138-467e-8d50-f43f798ee882" containerID="7bb8fa2528aad2fea8e781407c65ef5d74386220828229537196b321a35c5459" exitCode=0 Dec 06 02:12:06 crc kubenswrapper[4991]: I1206 02:12:06.828467 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" event={"ID":"1c3c72d5-a138-467e-8d50-f43f798ee882","Type":"ContainerDied","Data":"7bb8fa2528aad2fea8e781407c65ef5d74386220828229537196b321a35c5459"} Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.200290 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.238699 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-6rq4x"] Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.245762 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-6rq4x"] Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.305145 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7mqw\" (UniqueName: \"kubernetes.io/projected/1c3c72d5-a138-467e-8d50-f43f798ee882-kube-api-access-t7mqw\") pod \"1c3c72d5-a138-467e-8d50-f43f798ee882\" (UID: \"1c3c72d5-a138-467e-8d50-f43f798ee882\") " Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.305245 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3c72d5-a138-467e-8d50-f43f798ee882-host\") pod \"1c3c72d5-a138-467e-8d50-f43f798ee882\" (UID: 
\"1c3c72d5-a138-467e-8d50-f43f798ee882\") " Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.305343 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3c72d5-a138-467e-8d50-f43f798ee882-host" (OuterVolumeSpecName: "host") pod "1c3c72d5-a138-467e-8d50-f43f798ee882" (UID: "1c3c72d5-a138-467e-8d50-f43f798ee882"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.305828 4991 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c3c72d5-a138-467e-8d50-f43f798ee882-host\") on node \"crc\" DevicePath \"\"" Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.313191 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3c72d5-a138-467e-8d50-f43f798ee882-kube-api-access-t7mqw" (OuterVolumeSpecName: "kube-api-access-t7mqw") pod "1c3c72d5-a138-467e-8d50-f43f798ee882" (UID: "1c3c72d5-a138-467e-8d50-f43f798ee882"). InnerVolumeSpecName "kube-api-access-t7mqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.407867 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7mqw\" (UniqueName: \"kubernetes.io/projected/1c3c72d5-a138-467e-8d50-f43f798ee882-kube-api-access-t7mqw\") on node \"crc\" DevicePath \"\"" Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.854686 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc2705ef300b0206a51d4ee78349060de99071862259a887f9ad711b4eaf334" Dec 06 02:12:08 crc kubenswrapper[4991]: I1206 02:12:08.854755 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-6rq4x" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.058932 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c3c72d5-a138-467e-8d50-f43f798ee882" path="/var/lib/kubelet/pods/1c3c72d5-a138-467e-8d50-f43f798ee882/volumes" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.478333 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-cq4sg"] Dec 06 02:12:09 crc kubenswrapper[4991]: E1206 02:12:09.479156 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3c72d5-a138-467e-8d50-f43f798ee882" containerName="container-00" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.479175 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3c72d5-a138-467e-8d50-f43f798ee882" containerName="container-00" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.479448 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3c72d5-a138-467e-8d50-f43f798ee882" containerName="container-00" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.482164 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.631965 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqtmk\" (UniqueName: \"kubernetes.io/projected/d70b9988-47e5-47c2-9870-0f82af528183-kube-api-access-jqtmk\") pod \"crc-debug-cq4sg\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.632412 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d70b9988-47e5-47c2-9870-0f82af528183-host\") pod \"crc-debug-cq4sg\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.735015 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqtmk\" (UniqueName: \"kubernetes.io/projected/d70b9988-47e5-47c2-9870-0f82af528183-kube-api-access-jqtmk\") pod \"crc-debug-cq4sg\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.735140 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d70b9988-47e5-47c2-9870-0f82af528183-host\") pod \"crc-debug-cq4sg\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.735330 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d70b9988-47e5-47c2-9870-0f82af528183-host\") pod \"crc-debug-cq4sg\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc 
kubenswrapper[4991]: I1206 02:12:09.760629 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqtmk\" (UniqueName: \"kubernetes.io/projected/d70b9988-47e5-47c2-9870-0f82af528183-kube-api-access-jqtmk\") pod \"crc-debug-cq4sg\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.801590 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:09 crc kubenswrapper[4991]: I1206 02:12:09.866109 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" event={"ID":"d70b9988-47e5-47c2-9870-0f82af528183","Type":"ContainerStarted","Data":"6c7fadef58169e468a4486be02ea1ec2878ff4dffa468c94e0300afb0dc6a041"} Dec 06 02:12:10 crc kubenswrapper[4991]: I1206 02:12:10.876439 4991 generic.go:334] "Generic (PLEG): container finished" podID="d70b9988-47e5-47c2-9870-0f82af528183" containerID="918572af3cfaa906d3309814fe043f12ff9ce7e5952fefd76c3fc7b25039187a" exitCode=0 Dec 06 02:12:10 crc kubenswrapper[4991]: I1206 02:12:10.876511 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" event={"ID":"d70b9988-47e5-47c2-9870-0f82af528183","Type":"ContainerDied","Data":"918572af3cfaa906d3309814fe043f12ff9ce7e5952fefd76c3fc7b25039187a"} Dec 06 02:12:11 crc kubenswrapper[4991]: I1206 02:12:11.406947 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-cq4sg"] Dec 06 02:12:11 crc kubenswrapper[4991]: I1206 02:12:11.415465 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-cq4sg"] Dec 06 02:12:11 crc kubenswrapper[4991]: I1206 02:12:11.986793 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.081532 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqtmk\" (UniqueName: \"kubernetes.io/projected/d70b9988-47e5-47c2-9870-0f82af528183-kube-api-access-jqtmk\") pod \"d70b9988-47e5-47c2-9870-0f82af528183\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.081681 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d70b9988-47e5-47c2-9870-0f82af528183-host\") pod \"d70b9988-47e5-47c2-9870-0f82af528183\" (UID: \"d70b9988-47e5-47c2-9870-0f82af528183\") " Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.081825 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d70b9988-47e5-47c2-9870-0f82af528183-host" (OuterVolumeSpecName: "host") pod "d70b9988-47e5-47c2-9870-0f82af528183" (UID: "d70b9988-47e5-47c2-9870-0f82af528183"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.082250 4991 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d70b9988-47e5-47c2-9870-0f82af528183-host\") on node \"crc\" DevicePath \"\"" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.088734 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70b9988-47e5-47c2-9870-0f82af528183-kube-api-access-jqtmk" (OuterVolumeSpecName: "kube-api-access-jqtmk") pod "d70b9988-47e5-47c2-9870-0f82af528183" (UID: "d70b9988-47e5-47c2-9870-0f82af528183"). InnerVolumeSpecName "kube-api-access-jqtmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.184219 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqtmk\" (UniqueName: \"kubernetes.io/projected/d70b9988-47e5-47c2-9870-0f82af528183-kube-api-access-jqtmk\") on node \"crc\" DevicePath \"\"" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.653740 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-gkd89"] Dec 06 02:12:12 crc kubenswrapper[4991]: E1206 02:12:12.654577 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70b9988-47e5-47c2-9870-0f82af528183" containerName="container-00" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.654599 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70b9988-47e5-47c2-9870-0f82af528183" containerName="container-00" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.654853 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70b9988-47e5-47c2-9870-0f82af528183" containerName="container-00" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.655549 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.795508 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqshv\" (UniqueName: \"kubernetes.io/projected/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-kube-api-access-pqshv\") pod \"crc-debug-gkd89\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.795754 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-host\") pod \"crc-debug-gkd89\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.897410 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqshv\" (UniqueName: \"kubernetes.io/projected/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-kube-api-access-pqshv\") pod \"crc-debug-gkd89\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.897617 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-host\") pod \"crc-debug-gkd89\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.897749 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-host\") pod \"crc-debug-gkd89\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc 
kubenswrapper[4991]: I1206 02:12:12.900917 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7fadef58169e468a4486be02ea1ec2878ff4dffa468c94e0300afb0dc6a041" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.901042 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-cq4sg" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.919249 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqshv\" (UniqueName: \"kubernetes.io/projected/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-kube-api-access-pqshv\") pod \"crc-debug-gkd89\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:12 crc kubenswrapper[4991]: I1206 02:12:12.980623 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:13 crc kubenswrapper[4991]: I1206 02:12:13.050830 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70b9988-47e5-47c2-9870-0f82af528183" path="/var/lib/kubelet/pods/d70b9988-47e5-47c2-9870-0f82af528183/volumes" Dec 06 02:12:13 crc kubenswrapper[4991]: I1206 02:12:13.913755 4991 generic.go:334] "Generic (PLEG): container finished" podID="7c465c11-6dc6-44de-b251-b7d2ff1f82f1" containerID="89199a855f82fde0ea4dbb05ff96144e639c1a7c299f8e6bc30bdf8a0ac100ed" exitCode=0 Dec 06 02:12:13 crc kubenswrapper[4991]: I1206 02:12:13.913861 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-gkd89" event={"ID":"7c465c11-6dc6-44de-b251-b7d2ff1f82f1","Type":"ContainerDied","Data":"89199a855f82fde0ea4dbb05ff96144e639c1a7c299f8e6bc30bdf8a0ac100ed"} Dec 06 02:12:13 crc kubenswrapper[4991]: I1206 02:12:13.914254 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/crc-debug-gkd89" 
event={"ID":"7c465c11-6dc6-44de-b251-b7d2ff1f82f1","Type":"ContainerStarted","Data":"1eb8023d31c40a484c3451452d2d2c2d93a7aa280957b0418b96238a148db29d"} Dec 06 02:12:13 crc kubenswrapper[4991]: I1206 02:12:13.966962 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-gkd89"] Dec 06 02:12:13 crc kubenswrapper[4991]: I1206 02:12:13.978302 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2nnk/crc-debug-gkd89"] Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.047809 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.142558 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-host\") pod \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.142653 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqshv\" (UniqueName: \"kubernetes.io/projected/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-kube-api-access-pqshv\") pod \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\" (UID: \"7c465c11-6dc6-44de-b251-b7d2ff1f82f1\") " Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.142689 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-host" (OuterVolumeSpecName: "host") pod "7c465c11-6dc6-44de-b251-b7d2ff1f82f1" (UID: "7c465c11-6dc6-44de-b251-b7d2ff1f82f1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.143340 4991 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-host\") on node \"crc\" DevicePath \"\"" Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.150233 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-kube-api-access-pqshv" (OuterVolumeSpecName: "kube-api-access-pqshv") pod "7c465c11-6dc6-44de-b251-b7d2ff1f82f1" (UID: "7c465c11-6dc6-44de-b251-b7d2ff1f82f1"). InnerVolumeSpecName "kube-api-access-pqshv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.245051 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqshv\" (UniqueName: \"kubernetes.io/projected/7c465c11-6dc6-44de-b251-b7d2ff1f82f1-kube-api-access-pqshv\") on node \"crc\" DevicePath \"\"" Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.945197 4991 scope.go:117] "RemoveContainer" containerID="89199a855f82fde0ea4dbb05ff96144e639c1a7c299f8e6bc30bdf8a0ac100ed" Dec 06 02:12:15 crc kubenswrapper[4991]: I1206 02:12:15.945220 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/crc-debug-gkd89" Dec 06 02:12:17 crc kubenswrapper[4991]: I1206 02:12:17.058372 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c465c11-6dc6-44de-b251-b7d2ff1f82f1" path="/var/lib/kubelet/pods/7c465c11-6dc6-44de-b251-b7d2ff1f82f1/volumes" Dec 06 02:12:46 crc kubenswrapper[4991]: I1206 02:12:46.664589 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7f8d97fd-bwgzp_b53a8fe6-006f-49e8-9598-dd986c58e97a/barbican-api/0.log" Dec 06 02:12:46 crc kubenswrapper[4991]: I1206 02:12:46.792761 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d7f8d97fd-bwgzp_b53a8fe6-006f-49e8-9598-dd986c58e97a/barbican-api-log/0.log" Dec 06 02:12:46 crc kubenswrapper[4991]: I1206 02:12:46.844953 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b4c56449d-5hvrn_79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1/barbican-keystone-listener/0.log" Dec 06 02:12:46 crc kubenswrapper[4991]: I1206 02:12:46.956482 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b4c56449d-5hvrn_79ffe5af-2a10-4cd9-b8bc-1c2e49dae8b1/barbican-keystone-listener-log/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.057861 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d69cddbc-dkwjg_cc3413e2-6d39-4013-94ac-fb8d1c42e5ff/barbican-worker/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.133539 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d69cddbc-dkwjg_cc3413e2-6d39-4013-94ac-fb8d1c42e5ff/barbican-worker-log/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.270560 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-46jmn_aa853f65-564a-473e-a155-b5ac6e57eedf/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.371797 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/ceilometer-central-agent/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.505625 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/ceilometer-notification-agent/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.542857 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/proxy-httpd/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.585468 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b4e35617-6fcf-4b85-b5a4-d48d8a95f458/sg-core/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.780112 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_612d471c-b8ae-46ac-b3ff-9920f5615bd4/cinder-api/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.799488 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_612d471c-b8ae-46ac-b3ff-9920f5615bd4/cinder-api-log/0.log" Dec 06 02:12:47 crc kubenswrapper[4991]: I1206 02:12:47.974869 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89779edf-f784-441f-9420-1e03444eea94/cinder-scheduler/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.022817 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89779edf-f784-441f-9420-1e03444eea94/probe/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.088464 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hdf44_62d5caf2-21e5-4c94-b8f5-b5cf0f3804d8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.227401 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mwrmt_69c9b4b8-de83-4f43-8670-e9279daadb9b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.394753 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-69zsb_fa47d36c-149e-4778-b285-fa3b2e71277b/init/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.456806 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-69zsb_fa47d36c-149e-4778-b285-fa3b2e71277b/init/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.552243 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-69zsb_fa47d36c-149e-4778-b285-fa3b2e71277b/dnsmasq-dns/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.707728 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ctg6w_bc316311-f861-4cc3-bb9f-1a340e3ec877/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.812186 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b96bae9-f846-4e7e-b1df-3d51b139dcb5/glance-httpd/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.901950 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b96bae9-f846-4e7e-b1df-3d51b139dcb5/glance-log/0.log" Dec 06 02:12:48 crc kubenswrapper[4991]: I1206 02:12:48.996603 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_9569cf80-9b29-4a2e-8a9b-a618fb5b0128/glance-log/0.log" Dec 06 02:12:49 crc kubenswrapper[4991]: I1206 02:12:49.013442 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9569cf80-9b29-4a2e-8a9b-a618fb5b0128/glance-httpd/0.log" Dec 06 02:12:49 crc kubenswrapper[4991]: I1206 02:12:49.388937 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-577fd657dd-psmdm_15de5638-5e2b-4ce1-9d25-8f2830cd9241/horizon/0.log" Dec 06 02:12:49 crc kubenswrapper[4991]: I1206 02:12:49.455620 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2hls9_a64fe8ac-b9eb-4c7e-917b-56fb0da51872/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:49 crc kubenswrapper[4991]: I1206 02:12:49.718665 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gml2f_801ece65-0bd2-49cb-9c03-439c624bb145/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:49 crc kubenswrapper[4991]: I1206 02:12:49.723318 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-577fd657dd-psmdm_15de5638-5e2b-4ce1-9d25-8f2830cd9241/horizon-log/0.log" Dec 06 02:12:50 crc kubenswrapper[4991]: I1206 02:12:50.200704 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416441-t9lln_79a5c906-606c-4b08-bb99-fadf17a19cfa/keystone-cron/0.log" Dec 06 02:12:50 crc kubenswrapper[4991]: I1206 02:12:50.248822 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-99654f6-8nq4x_72ca5f78-0a0d-4e09-8975-cba79bab4842/keystone-api/0.log" Dec 06 02:12:50 crc kubenswrapper[4991]: I1206 02:12:50.393397 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_64844b68-4772-49f7-9328-8eed4c925267/kube-state-metrics/0.log" Dec 06 02:12:50 crc kubenswrapper[4991]: I1206 02:12:50.474234 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rzttv_908dd8e5-0b1f-494f-8e10-5ee48d1faeb4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:50 crc kubenswrapper[4991]: I1206 02:12:50.902130 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-779776dbb5-wnlcw_0027a35f-14c5-42ff-b153-359f129f0d2b/neutron-api/0.log" Dec 06 02:12:50 crc kubenswrapper[4991]: I1206 02:12:50.911747 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-779776dbb5-wnlcw_0027a35f-14c5-42ff-b153-359f129f0d2b/neutron-httpd/0.log" Dec 06 02:12:51 crc kubenswrapper[4991]: I1206 02:12:51.003046 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lxkxq_46509879-7d53-4e4c-b581-9c48e1b350ad/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:51 crc kubenswrapper[4991]: I1206 02:12:51.601058 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d718f384-f95f-4601-afc7-0aa3655b9855/nova-api-log/0.log" Dec 06 02:12:51 crc kubenswrapper[4991]: I1206 02:12:51.716814 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_380a9e11-2703-494e-aae0-ce7758d34509/nova-cell0-conductor-conductor/0.log" Dec 06 02:12:52 crc kubenswrapper[4991]: I1206 02:12:52.115820 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8d67a449-d9a3-4a6a-bd6f-bbdd4c0de1fd/nova-cell1-conductor-conductor/0.log" Dec 06 02:12:52 crc kubenswrapper[4991]: I1206 02:12:52.223021 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c5be01c8-e833-4da5-a09e-da95ec913708/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 02:12:52 crc kubenswrapper[4991]: I1206 02:12:52.273953 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d718f384-f95f-4601-afc7-0aa3655b9855/nova-api-api/0.log" Dec 06 02:12:52 crc kubenswrapper[4991]: I1206 02:12:52.388945 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jntfk_3b3f0520-92f4-4e3e-aa3e-8950cdc437e9/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:52 crc kubenswrapper[4991]: I1206 02:12:52.594204 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bae01666-bf5e-4dc0-a540-440d511229e7/nova-metadata-log/0.log" Dec 06 02:12:52 crc kubenswrapper[4991]: I1206 02:12:52.934013 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e937b594-10b6-4742-a47a-7d21f0c19636/mysql-bootstrap/0.log" Dec 06 02:12:53 crc kubenswrapper[4991]: I1206 02:12:53.025411 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a1581dfb-5721-4ba3-9364-4b891a476821/nova-scheduler-scheduler/0.log" Dec 06 02:12:53 crc kubenswrapper[4991]: I1206 02:12:53.152182 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e937b594-10b6-4742-a47a-7d21f0c19636/mysql-bootstrap/0.log" Dec 06 02:12:53 crc kubenswrapper[4991]: I1206 02:12:53.195166 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e937b594-10b6-4742-a47a-7d21f0c19636/galera/0.log" Dec 06 02:12:53 crc kubenswrapper[4991]: I1206 02:12:53.401176 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1b6ac217-8e85-4279-b2d1-7dd35227f828/mysql-bootstrap/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.201775 4991 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bae01666-bf5e-4dc0-a540-440d511229e7/nova-metadata-metadata/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.216187 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1b6ac217-8e85-4279-b2d1-7dd35227f828/mysql-bootstrap/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.221675 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1b6ac217-8e85-4279-b2d1-7dd35227f828/galera/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.431793 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ff108074-7904-43a0-b621-6676980779f6/openstackclient/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.509412 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-82ddm_a3995a5a-dfc3-4769-8161-31f7e2234d13/ovn-controller/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.661605 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ltzgn_3e130223-fd73-41e3-8af7-d0637555c009/openstack-network-exporter/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.728044 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovsdb-server-init/0.log" Dec 06 02:12:54 crc kubenswrapper[4991]: I1206 02:12:54.967709 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovsdb-server-init/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.008085 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovsdb-server/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.125433 4991 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-529dn_6e642a31-7110-47d8-b322-91900cf49151/ovs-vswitchd/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.257046 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fcgx6_8ae62c44-69d1-4cbf-abb9-491082bf83e3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.350438 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_49a9417e-79cd-4b0c-a5b0-5528a9e20edc/openstack-network-exporter/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.424948 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_49a9417e-79cd-4b0c-a5b0-5528a9e20edc/ovn-northd/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.533289 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e154c690-d20d-463c-af7d-257a3ae60f3c/openstack-network-exporter/0.log" Dec 06 02:12:55 crc kubenswrapper[4991]: I1206 02:12:55.594629 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e154c690-d20d-463c-af7d-257a3ae60f3c/ovsdbserver-nb/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.258122 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c775c23-51a7-4213-9a70-1382e7c55a1e/openstack-network-exporter/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.263017 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c775c23-51a7-4213-9a70-1382e7c55a1e/ovsdbserver-sb/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.360460 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d59dffb86-qrfm9_cd5faf07-40cf-4cce-9c5a-0fb43ffbb993/placement-api/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.549786 4991 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2e3e3138-a6f6-4005-b57d-ee02b9534e57/setup-container/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.581622 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d59dffb86-qrfm9_cd5faf07-40cf-4cce-9c5a-0fb43ffbb993/placement-log/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.793502 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a/setup-container/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.827366 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2e3e3138-a6f6-4005-b57d-ee02b9534e57/setup-container/0.log" Dec 06 02:12:56 crc kubenswrapper[4991]: I1206 02:12:56.848917 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2e3e3138-a6f6-4005-b57d-ee02b9534e57/rabbitmq/0.log" Dec 06 02:12:57 crc kubenswrapper[4991]: I1206 02:12:57.123602 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hmx9g_8695023e-4250-4f03-960e-873fd9f9fba5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:57 crc kubenswrapper[4991]: I1206 02:12:57.156198 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a/rabbitmq/0.log" Dec 06 02:12:57 crc kubenswrapper[4991]: I1206 02:12:57.166598 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f06bb7d4-b2e9-4bd5-8ae4-8cd7458dcf8a/setup-container/0.log" Dec 06 02:12:57 crc kubenswrapper[4991]: I1206 02:12:57.373113 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zvphr_eb12fac8-d6c4-4be8-907f-a06cae6bc067/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:57 crc 
kubenswrapper[4991]: I1206 02:12:57.645888 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lxh7l_128953a1-f318-43c1-a2bb-3b006c8600a4/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:57 crc kubenswrapper[4991]: I1206 02:12:57.835773 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-lwgtr_1c339885-2776-49cc-88b6-e79ff184d76a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:57 crc kubenswrapper[4991]: I1206 02:12:57.937781 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-74xt7_dfadc831-65e8-4c83-a3f6-1316db177fb5/ssh-known-hosts-edpm-deployment/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.208455 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-ff7f9b9c9-g4mpb_bbcad69c-20b6-46da-95b6-28a7d993ebab/proxy-httpd/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.234591 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-ff7f9b9c9-g4mpb_bbcad69c-20b6-46da-95b6-28a7d993ebab/proxy-server/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.346372 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mtqz6_955d4a82-f627-490d-bc6c-cb7420010230/swift-ring-rebalance/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.491761 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-auditor/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.494022 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-reaper/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.638028 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-replicator/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.718872 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-auditor/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.733394 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/account-server/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.757513 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-replicator/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.891802 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-server/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.893531 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/container-updater/0.log" Dec 06 02:12:58 crc kubenswrapper[4991]: I1206 02:12:58.999272 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-auditor/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.036590 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-expirer/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.125042 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-server/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.183951 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-replicator/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.280778 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/object-updater/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.284577 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/rsync/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.331958 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_04a746be-9ac3-4e2e-a264-6fc99dc51527/swift-recon-cron/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.567958 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_67d48b5b-50a6-4515-9ca7-bc491239938a/tempest-tests-tempest-tests-runner/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.627546 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-bxrx9_3067d2e3-f5b3-45c8-9e37-8a29a2dd153f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.794609 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ce84be0d-7832-4602-a7fb-977d635de15b/test-operator-logs-container/0.log" Dec 06 02:12:59 crc kubenswrapper[4991]: I1206 02:12:59.902266 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9l6vj_e4c4d67b-be1b-4502-a13d-96a119cbb436/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 02:13:10 crc kubenswrapper[4991]: I1206 02:13:10.133441 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_68a4be52-9eb9-4c1f-b625-de969a73a990/memcached/0.log" Dec 06 02:13:29 crc kubenswrapper[4991]: I1206 02:13:29.660550 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-crkd2_66cf73ac-d6ca-4a8c-855b-3d26bec5adb7/kube-rbac-proxy/0.log" Dec 06 02:13:29 crc kubenswrapper[4991]: I1206 02:13:29.747390 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-crkd2_66cf73ac-d6ca-4a8c-855b-3d26bec5adb7/manager/0.log" Dec 06 02:13:29 crc kubenswrapper[4991]: I1206 02:13:29.878402 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/util/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.105744 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/pull/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.112617 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/util/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.131999 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/pull/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.293388 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/util/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.295887 4991 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/extract/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.310677 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be5d9185a2c751a808eec367d0701d79d32720cb6a12d8d6e6299f3a66v82fh_8263afa1-30a1-4b88-b31a-7ca24507edbf/pull/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.470455 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-gbqt8_c99be400-34b2-431b-ba71-9c6c03f70c91/kube-rbac-proxy/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.522651 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-gbqt8_c99be400-34b2-431b-ba71-9c6c03f70c91/manager/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.574075 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-twvbc_7ceea028-825c-4235-b1bb-52adec27f46d/kube-rbac-proxy/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.666885 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-twvbc_7ceea028-825c-4235-b1bb-52adec27f46d/manager/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.738914 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hll9p_4275f29c-130a-4e02-aeb5-6410643706b8/kube-rbac-proxy/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.839683 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hll9p_4275f29c-130a-4e02-aeb5-6410643706b8/manager/0.log" Dec 06 02:13:30 crc 
kubenswrapper[4991]: I1206 02:13:30.937323 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5j6br_14a62a5b-5719-4192-b2d8-40dc2e000f41/kube-rbac-proxy/0.log" Dec 06 02:13:30 crc kubenswrapper[4991]: I1206 02:13:30.967774 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5j6br_14a62a5b-5719-4192-b2d8-40dc2e000f41/manager/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.139851 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-h9lxn_fd88f0ae-0cbd-4958-a05a-51e18d3ff32f/manager/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.157396 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-h9lxn_fd88f0ae-0cbd-4958-a05a-51e18d3ff32f/kube-rbac-proxy/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.303252 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-96kw2_37dd0775-3a19-4005-9cf0-7c038c293004/kube-rbac-proxy/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.538685 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-96kw2_37dd0775-3a19-4005-9cf0-7c038c293004/manager/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.544599 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ftl59_852a1018-8ddc-4a27-ba0c-8a1fcf0d010f/kube-rbac-proxy/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.579189 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ftl59_852a1018-8ddc-4a27-ba0c-8a1fcf0d010f/manager/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.745046 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-lwqvk_d21383b5-1b67-4b8f-a176-05769a28528d/kube-rbac-proxy/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.839129 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-lwqvk_d21383b5-1b67-4b8f-a176-05769a28528d/manager/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.858040 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-trx9l_7f3695d3-1d23-4779-90a6-d1b62ac3edc5/kube-rbac-proxy/0.log" Dec 06 02:13:31 crc kubenswrapper[4991]: I1206 02:13:31.962361 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-trx9l_7f3695d3-1d23-4779-90a6-d1b62ac3edc5/manager/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.057478 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4pvnj_4fafae92-8cd1-450f-9e3b-7abd1020ab75/kube-rbac-proxy/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.186999 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-4pvnj_4fafae92-8cd1-450f-9e3b-7abd1020ab75/manager/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.221026 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rlhms_e5351a0d-ece3-441f-95f6-ccc4281e9462/kube-rbac-proxy/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 
02:13:32.310669 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-rlhms_e5351a0d-ece3-441f-95f6-ccc4281e9462/manager/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.409527 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xjgls_8c5d96f9-58a1-4ee7-82dd-f71237ef875c/kube-rbac-proxy/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.503446 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xjgls_8c5d96f9-58a1-4ee7-82dd-f71237ef875c/manager/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.565034 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4hx8k_91bdc603-1933-4f22-9ecd-d3e95f2f0924/kube-rbac-proxy/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.654162 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-4hx8k_91bdc603-1933-4f22-9ecd-d3e95f2f0924/manager/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.758232 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr_1d549c0d-f4fd-4eeb-852b-78e302f65110/manager/0.log" Dec 06 02:13:32 crc kubenswrapper[4991]: I1206 02:13:32.762456 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4wxbnr_1d549c0d-f4fd-4eeb-852b-78e302f65110/kube-rbac-proxy/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.072174 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qv7hh_8ed7eb4a-d10c-40bd-9db4-2f8c0434fd75/registry-server/0.log" Dec 06 
02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.221528 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-9phkz_a4fd6e71-89f4-41d3-beed-a894e45b8fe8/kube-rbac-proxy/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.260675 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5f6c6584bc-6swc7_dd84f245-235a-406c-be63-88e9170e9902/operator/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.322022 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-9phkz_a4fd6e71-89f4-41d3-beed-a894e45b8fe8/manager/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.508468 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-72txt_725ee172-29cd-40c6-b0a1-9be795634566/kube-rbac-proxy/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.521809 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-72txt_725ee172-29cd-40c6-b0a1-9be795634566/manager/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.723049 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gkl7x_6b1fa1e2-0eb6-48bf-bebc-93db7d90a644/kube-rbac-proxy/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.824302 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-z5dnf_5cfe648a-c326-46aa-83ed-31659feb31d0/operator/0.log" Dec 06 02:13:33 crc kubenswrapper[4991]: I1206 02:13:33.957497 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-gkl7x_6b1fa1e2-0eb6-48bf-bebc-93db7d90a644/manager/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.068805 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-gdckl_54706fa6-a00c-4f66-bce5-660d5475ff7b/kube-rbac-proxy/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.122631 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-gdckl_54706fa6-a00c-4f66-bce5-660d5475ff7b/manager/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.211839 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d5d6748cd-5gjhx_a60f2feb-b041-4bb0-b718-9e669f3cf9c8/manager/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.313347 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5v8fh_7869a8de-b32f-4600-af69-49a42ec4d096/kube-rbac-proxy/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.350294 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-5v8fh_7869a8de-b32f-4600-af69-49a42ec4d096/manager/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.403662 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r6xm6_35ecf5a7-ec56-4cfc-a96b-84dada809373/kube-rbac-proxy/0.log" Dec 06 02:13:34 crc kubenswrapper[4991]: I1206 02:13:34.441923 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r6xm6_35ecf5a7-ec56-4cfc-a96b-84dada809373/manager/0.log" Dec 06 02:13:46 crc kubenswrapper[4991]: I1206 02:13:46.739382 4991 
patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:13:46 crc kubenswrapper[4991]: I1206 02:13:46.739919 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.612295 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b7j79"] Dec 06 02:13:53 crc kubenswrapper[4991]: E1206 02:13:53.613286 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c465c11-6dc6-44de-b251-b7d2ff1f82f1" containerName="container-00" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.613300 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c465c11-6dc6-44de-b251-b7d2ff1f82f1" containerName="container-00" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.613513 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c465c11-6dc6-44de-b251-b7d2ff1f82f1" containerName="container-00" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.614913 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.635482 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7j79"] Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.768484 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r5pt\" (UniqueName: \"kubernetes.io/projected/78ff959f-2f72-47bb-94c8-84b8bba0fe99-kube-api-access-4r5pt\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.768582 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-utilities\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.768650 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-catalog-content\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.871141 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-catalog-content\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.871300 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4r5pt\" (UniqueName: \"kubernetes.io/projected/78ff959f-2f72-47bb-94c8-84b8bba0fe99-kube-api-access-4r5pt\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.871357 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-utilities\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.871871 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-utilities\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.871894 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-catalog-content\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.902823 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r5pt\" (UniqueName: \"kubernetes.io/projected/78ff959f-2f72-47bb-94c8-84b8bba0fe99-kube-api-access-4r5pt\") pod \"redhat-marketplace-b7j79\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:53 crc kubenswrapper[4991]: I1206 02:13:53.948726 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:13:54 crc kubenswrapper[4991]: I1206 02:13:54.531608 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7j79"] Dec 06 02:13:54 crc kubenswrapper[4991]: I1206 02:13:54.952458 4991 generic.go:334] "Generic (PLEG): container finished" podID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerID="83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80" exitCode=0 Dec 06 02:13:54 crc kubenswrapper[4991]: I1206 02:13:54.952552 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerDied","Data":"83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80"} Dec 06 02:13:54 crc kubenswrapper[4991]: I1206 02:13:54.952896 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerStarted","Data":"e0f391daf26aa464aa0d7373006e5e005ea918df6ae11d8898636b3ef1f81c53"} Dec 06 02:13:54 crc kubenswrapper[4991]: I1206 02:13:54.954811 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 02:13:55 crc kubenswrapper[4991]: I1206 02:13:55.964445 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerStarted","Data":"5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb"} Dec 06 02:13:56 crc kubenswrapper[4991]: I1206 02:13:56.628736 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c9qfb_9a829457-575c-4b6b-bc3b-68c117691597/control-plane-machine-set-operator/0.log" Dec 06 02:13:56 crc kubenswrapper[4991]: I1206 02:13:56.772250 4991 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hwqgl_6f04204c-29c6-4b62-b4f4-28935672a20e/kube-rbac-proxy/0.log" Dec 06 02:13:56 crc kubenswrapper[4991]: I1206 02:13:56.869100 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hwqgl_6f04204c-29c6-4b62-b4f4-28935672a20e/machine-api-operator/0.log" Dec 06 02:13:56 crc kubenswrapper[4991]: I1206 02:13:56.977028 4991 generic.go:334] "Generic (PLEG): container finished" podID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerID="5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb" exitCode=0 Dec 06 02:13:56 crc kubenswrapper[4991]: I1206 02:13:56.977143 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerDied","Data":"5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb"} Dec 06 02:13:57 crc kubenswrapper[4991]: I1206 02:13:57.989850 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerStarted","Data":"22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e"} Dec 06 02:13:58 crc kubenswrapper[4991]: I1206 02:13:58.012674 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b7j79" podStartSLOduration=2.574273767 podStartE2EDuration="5.012655628s" podCreationTimestamp="2025-12-06 02:13:53 +0000 UTC" firstStartedPulling="2025-12-06 02:13:54.954538375 +0000 UTC m=+4308.304828843" lastFinishedPulling="2025-12-06 02:13:57.392920236 +0000 UTC m=+4310.743210704" observedRunningTime="2025-12-06 02:13:58.007938303 +0000 UTC m=+4311.358228791" watchObservedRunningTime="2025-12-06 02:13:58.012655628 +0000 UTC m=+4311.362946096" Dec 06 02:14:03 crc kubenswrapper[4991]: I1206 
02:14:03.949787 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:14:03 crc kubenswrapper[4991]: I1206 02:14:03.950584 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:14:04 crc kubenswrapper[4991]: I1206 02:14:04.011342 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:14:04 crc kubenswrapper[4991]: I1206 02:14:04.113795 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:14:04 crc kubenswrapper[4991]: I1206 02:14:04.253164 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7j79"] Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.063341 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b7j79" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="registry-server" containerID="cri-o://22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e" gracePeriod=2 Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.588594 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.665806 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r5pt\" (UniqueName: \"kubernetes.io/projected/78ff959f-2f72-47bb-94c8-84b8bba0fe99-kube-api-access-4r5pt\") pod \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.666075 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-catalog-content\") pod \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.666151 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-utilities\") pod \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\" (UID: \"78ff959f-2f72-47bb-94c8-84b8bba0fe99\") " Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.666874 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-utilities" (OuterVolumeSpecName: "utilities") pod "78ff959f-2f72-47bb-94c8-84b8bba0fe99" (UID: "78ff959f-2f72-47bb-94c8-84b8bba0fe99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.678448 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ff959f-2f72-47bb-94c8-84b8bba0fe99-kube-api-access-4r5pt" (OuterVolumeSpecName: "kube-api-access-4r5pt") pod "78ff959f-2f72-47bb-94c8-84b8bba0fe99" (UID: "78ff959f-2f72-47bb-94c8-84b8bba0fe99"). InnerVolumeSpecName "kube-api-access-4r5pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.685183 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78ff959f-2f72-47bb-94c8-84b8bba0fe99" (UID: "78ff959f-2f72-47bb-94c8-84b8bba0fe99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.768745 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.768802 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r5pt\" (UniqueName: \"kubernetes.io/projected/78ff959f-2f72-47bb-94c8-84b8bba0fe99-kube-api-access-4r5pt\") on node \"crc\" DevicePath \"\"" Dec 06 02:14:06 crc kubenswrapper[4991]: I1206 02:14:06.768821 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ff959f-2f72-47bb-94c8-84b8bba0fe99-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.086140 4991 generic.go:334] "Generic (PLEG): container finished" podID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerID="22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e" exitCode=0 Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.086223 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerDied","Data":"22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e"} Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.086286 4991 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7j79" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.086316 4991 scope.go:117] "RemoveContainer" containerID="22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.086291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7j79" event={"ID":"78ff959f-2f72-47bb-94c8-84b8bba0fe99","Type":"ContainerDied","Data":"e0f391daf26aa464aa0d7373006e5e005ea918df6ae11d8898636b3ef1f81c53"} Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.120338 4991 scope.go:117] "RemoveContainer" containerID="5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.159066 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7j79"] Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.162626 4991 scope.go:117] "RemoveContainer" containerID="83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.170583 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7j79"] Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.219106 4991 scope.go:117] "RemoveContainer" containerID="22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e" Dec 06 02:14:07 crc kubenswrapper[4991]: E1206 02:14:07.219590 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e\": container with ID starting with 22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e not found: ID does not exist" containerID="22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.219782 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e"} err="failed to get container status \"22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e\": rpc error: code = NotFound desc = could not find container \"22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e\": container with ID starting with 22589e4b3e20672122b60d7b371d5ff06f3be049a08ed368a5c136aa2e90f26e not found: ID does not exist" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.219897 4991 scope.go:117] "RemoveContainer" containerID="5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb" Dec 06 02:14:07 crc kubenswrapper[4991]: E1206 02:14:07.220318 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb\": container with ID starting with 5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb not found: ID does not exist" containerID="5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.220350 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb"} err="failed to get container status \"5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb\": rpc error: code = NotFound desc = could not find container \"5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb\": container with ID starting with 5bdf1244b2ef7fe4d340555d53b3bc524a995f29343e54e3ac0327c64b8af6fb not found: ID does not exist" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.220475 4991 scope.go:117] "RemoveContainer" containerID="83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80" Dec 06 02:14:07 crc kubenswrapper[4991]: E1206 
02:14:07.220822 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80\": container with ID starting with 83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80 not found: ID does not exist" containerID="83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80" Dec 06 02:14:07 crc kubenswrapper[4991]: I1206 02:14:07.220916 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80"} err="failed to get container status \"83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80\": rpc error: code = NotFound desc = could not find container \"83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80\": container with ID starting with 83fd7a9d994a1367fbb0fb6141f4323013bf3a28639dde300e4a74b63ec9fc80 not found: ID does not exist" Dec 06 02:14:09 crc kubenswrapper[4991]: I1206 02:14:09.054606 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" path="/var/lib/kubelet/pods/78ff959f-2f72-47bb-94c8-84b8bba0fe99/volumes" Dec 06 02:14:11 crc kubenswrapper[4991]: I1206 02:14:11.735922 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4w5qq_4c9fcb25-80e7-4c9a-b31d-68a4b1b82411/cert-manager-controller/0.log" Dec 06 02:14:11 crc kubenswrapper[4991]: I1206 02:14:11.938305 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wcfbf_8bcf50ab-be54-4a1a-95ca-f43426ebf55e/cert-manager-cainjector/0.log" Dec 06 02:14:11 crc kubenswrapper[4991]: I1206 02:14:11.996762 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2zqr8_217fa056-c208-4c13-af81-12e5e1c4d231/cert-manager-webhook/0.log" Dec 06 
02:14:16 crc kubenswrapper[4991]: I1206 02:14:16.740529 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:14:16 crc kubenswrapper[4991]: I1206 02:14:16.741532 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:14:26 crc kubenswrapper[4991]: I1206 02:14:26.118992 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-gm4xg_7b08a3d0-cce4-42df-b89a-a9b11ef39a0d/nmstate-console-plugin/0.log" Dec 06 02:14:26 crc kubenswrapper[4991]: I1206 02:14:26.223618 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6vvck_ac4eae73-f13b-4e52-9b25-f837bba897b0/nmstate-handler/0.log" Dec 06 02:14:26 crc kubenswrapper[4991]: I1206 02:14:26.287369 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-9czh7_8ba61c7d-dc05-4834-9e0e-e18bb755d1a9/kube-rbac-proxy/0.log" Dec 06 02:14:26 crc kubenswrapper[4991]: I1206 02:14:26.293529 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-9czh7_8ba61c7d-dc05-4834-9e0e-e18bb755d1a9/nmstate-metrics/0.log" Dec 06 02:14:26 crc kubenswrapper[4991]: I1206 02:14:26.501416 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-nl2cl_47befc6f-6b9a-4a39-a029-1d55f420cf3b/nmstate-operator/0.log" Dec 06 02:14:26 crc kubenswrapper[4991]: I1206 02:14:26.528998 
4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-q6ftt_461dd677-1efc-468e-833f-125c37a2c7b8/nmstate-webhook/0.log" Dec 06 02:14:42 crc kubenswrapper[4991]: I1206 02:14:42.921143 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qtv46_8110ff83-33b6-4f27-946e-383e8bff44c7/kube-rbac-proxy/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.085435 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qtv46_8110ff83-33b6-4f27-946e-383e8bff44c7/controller/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.137220 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.332281 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.383544 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.432964 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.483427 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.575752 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.630201 4991 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.688408 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.696309 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.923617 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-frr-files/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.945530 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/controller/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.946693 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-reloader/0.log" Dec 06 02:14:43 crc kubenswrapper[4991]: I1206 02:14:43.982702 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/cp-metrics/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.148320 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/frr-metrics/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.209881 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/kube-rbac-proxy-frr/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.234316 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/kube-rbac-proxy/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.405115 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/reloader/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.467220 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-t5xxb_61ac5c00-6180-4a7b-a4c0-fd98da3bb456/frr-k8s-webhook-server/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.705642 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-545bb986f8-rh6kd_4fd49c2f-3ac6-4444-ab0a-4d9a47617377/manager/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.867021 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8495449b99-p7f45_2eba891b-78cc-4a1c-8f65-a09829f349e8/webhook-server/0.log" Dec 06 02:14:44 crc kubenswrapper[4991]: I1206 02:14:44.970371 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4hfhs_79797e2b-b768-414a-8f3f-0a68ae045269/kube-rbac-proxy/0.log" Dec 06 02:14:45 crc kubenswrapper[4991]: I1206 02:14:45.606075 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fkgqs_a82fdef0-9a99-4150-b86b-c8839caa41a1/frr/0.log" Dec 06 02:14:45 crc kubenswrapper[4991]: I1206 02:14:45.649593 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4hfhs_79797e2b-b768-414a-8f3f-0a68ae045269/speaker/0.log" Dec 06 02:14:46 crc kubenswrapper[4991]: I1206 02:14:46.739822 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:14:46 crc kubenswrapper[4991]: I1206 02:14:46.740204 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:14:46 crc kubenswrapper[4991]: I1206 02:14:46.740294 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 02:14:46 crc kubenswrapper[4991]: I1206 02:14:46.741174 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acdbee5b0dca7edd2b37629b57df167a9453fec9ce78915f11caf35e1a51bfe0"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 02:14:46 crc kubenswrapper[4991]: I1206 02:14:46.741240 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://acdbee5b0dca7edd2b37629b57df167a9453fec9ce78915f11caf35e1a51bfe0" gracePeriod=600 Dec 06 02:14:47 crc kubenswrapper[4991]: I1206 02:14:47.494490 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="acdbee5b0dca7edd2b37629b57df167a9453fec9ce78915f11caf35e1a51bfe0" exitCode=0 Dec 06 02:14:47 crc kubenswrapper[4991]: I1206 02:14:47.494579 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" 
event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"acdbee5b0dca7edd2b37629b57df167a9453fec9ce78915f11caf35e1a51bfe0"} Dec 06 02:14:47 crc kubenswrapper[4991]: I1206 02:14:47.495132 4991 scope.go:117] "RemoveContainer" containerID="3f69fd2784f79b950316e24a856a47d6fb8cc89cd4f6585f544e9f30b9f5185a" Dec 06 02:14:47 crc kubenswrapper[4991]: I1206 02:14:47.495180 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerStarted","Data":"369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b"} Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.193124 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7"] Dec 06 02:15:00 crc kubenswrapper[4991]: E1206 02:15:00.194877 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="extract-content" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.194903 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="extract-content" Dec 06 02:15:00 crc kubenswrapper[4991]: E1206 02:15:00.194923 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="registry-server" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.194930 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="registry-server" Dec 06 02:15:00 crc kubenswrapper[4991]: E1206 02:15:00.194949 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="extract-utilities" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.194972 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="extract-utilities" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.195284 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ff959f-2f72-47bb-94c8-84b8bba0fe99" containerName="registry-server" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.196424 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.202554 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.202638 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.247485 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7"] Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.318436 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2nl\" (UniqueName: \"kubernetes.io/projected/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-kube-api-access-kn2nl\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.318519 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-secret-volume\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 
crc kubenswrapper[4991]: I1206 02:15:00.318661 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-config-volume\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.420383 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2nl\" (UniqueName: \"kubernetes.io/projected/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-kube-api-access-kn2nl\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.420472 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-secret-volume\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.420543 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-config-volume\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.421407 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-config-volume\") pod \"collect-profiles-29416455-mb9x7\" (UID: 
\"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.431658 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-secret-volume\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.443010 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2nl\" (UniqueName: \"kubernetes.io/projected/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-kube-api-access-kn2nl\") pod \"collect-profiles-29416455-mb9x7\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:00 crc kubenswrapper[4991]: I1206 02:15:00.525410 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.068197 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7"] Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.607668 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/util/0.log" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.628200 4991 generic.go:334] "Generic (PLEG): container finished" podID="a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2" containerID="148ded3f8796e4a5174e6dcbce4de6c566440d1054cf48bda55cce7240d64e67" exitCode=0 Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.628246 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" event={"ID":"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2","Type":"ContainerDied","Data":"148ded3f8796e4a5174e6dcbce4de6c566440d1054cf48bda55cce7240d64e67"} Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.628272 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" event={"ID":"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2","Type":"ContainerStarted","Data":"3d733281fd1d6e0fb16b4287c9068acfedf1df7d9e04d7e7a8abd342b3051d7f"} Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.815395 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mc9ts"] Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.819724 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.834718 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc9ts"] Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.874880 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/util/0.log" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.937368 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/pull/0.log" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.970627 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6hq\" (UniqueName: \"kubernetes.io/projected/755e0e60-a808-43eb-a0a3-82f89978a640-kube-api-access-tt6hq\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.970730 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-utilities\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:01 crc kubenswrapper[4991]: I1206 02:15:01.970771 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-catalog-content\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " 
pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.000503 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/pull/0.log" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.072682 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6hq\" (UniqueName: \"kubernetes.io/projected/755e0e60-a808-43eb-a0a3-82f89978a640-kube-api-access-tt6hq\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.072780 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-utilities\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.072819 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-catalog-content\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.073333 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-catalog-content\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.073364 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-utilities\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.101419 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6hq\" (UniqueName: \"kubernetes.io/projected/755e0e60-a808-43eb-a0a3-82f89978a640-kube-api-access-tt6hq\") pod \"community-operators-mc9ts\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.143575 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.206748 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/util/0.log" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.310293 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/extract/0.log" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.322707 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnzf99_6f2005a0-682b-485a-bb40-b60cd9683c2e/pull/0.log" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.660609 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/util/0.log" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.909207 4991 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc9ts"] Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.970782 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/util/0.log" Dec 06 02:15:02 crc kubenswrapper[4991]: I1206 02:15:02.984000 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/pull/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.011821 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/pull/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.182760 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/util/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.195579 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.211471 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/pull/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.219517 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x7pmx_c4cc76d8-965d-4228-8e93-4d9e736a9f95/extract/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.301730 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-config-volume\") pod \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.301804 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn2nl\" (UniqueName: \"kubernetes.io/projected/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-kube-api-access-kn2nl\") pod \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.301884 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-secret-volume\") pod \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\" (UID: \"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2\") " Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.303187 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-config-volume" (OuterVolumeSpecName: "config-volume") pod "a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2" 
(UID: "a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.309184 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-kube-api-access-kn2nl" (OuterVolumeSpecName: "kube-api-access-kn2nl") pod "a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2" (UID: "a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2"). InnerVolumeSpecName "kube-api-access-kn2nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.309267 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2" (UID: "a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.403977 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.404043 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn2nl\" (UniqueName: \"kubernetes.io/projected/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-kube-api-access-kn2nl\") on node \"crc\" DevicePath \"\"" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.404056 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.404843 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-utilities/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.565093 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-content/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.588647 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-content/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.630343 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-utilities/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.646887 4991 generic.go:334] "Generic (PLEG): container finished" podID="755e0e60-a808-43eb-a0a3-82f89978a640" containerID="61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3" exitCode=0 Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.646963 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerDied","Data":"61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3"} Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.647012 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerStarted","Data":"e06b140f04d10a8e6569dd2e5feb5875d82961af5cbff1b4ae625a5bd66c308b"} Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.652466 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" 
event={"ID":"a0a51c8e-2bb1-4d76-ac42-198a8bb3cea2","Type":"ContainerDied","Data":"3d733281fd1d6e0fb16b4287c9068acfedf1df7d9e04d7e7a8abd342b3051d7f"} Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.652502 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d733281fd1d6e0fb16b4287c9068acfedf1df7d9e04d7e7a8abd342b3051d7f" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.652565 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416455-mb9x7" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.808781 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-utilities/0.log" Dec 06 02:15:03 crc kubenswrapper[4991]: I1206 02:15:03.838522 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/extract-content/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.110589 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-utilities/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.307475 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68"] Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.308592 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-utilities/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.340973 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-content/0.log" Dec 06 02:15:04 crc 
kubenswrapper[4991]: I1206 02:15:04.344470 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416410-prk68"] Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.387329 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w9mn9_97d40092-c83e-4410-9535-73e4aad9eafc/registry-server/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.390717 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-content/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.533070 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-content/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.588866 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/extract-utilities/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.663758 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerStarted","Data":"081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0"} Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.814388 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xs7vk_10a1dada-c415-427c-a5ed-36eac2391451/marketplace-operator/0.log" Dec 06 02:15:04 crc kubenswrapper[4991]: I1206 02:15:04.959006 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-utilities/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.051703 4991 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25dc6d8b-19d1-45c4-97ed-e5c426a46dce" path="/var/lib/kubelet/pods/25dc6d8b-19d1-45c4-97ed-e5c426a46dce/volumes" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.142473 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-v9ncc_d9f080b8-d4ac-4dd5-9ce8-850cb9303ae1/registry-server/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.197498 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-utilities/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.223277 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-content/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.240054 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-content/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.402458 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-content/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.425211 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/extract-utilities/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.525208 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-chxpn_a9de55b2-75a9-4843-bfee-297b2522e7f0/registry-server/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.644404 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-utilities/0.log" Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.677680 4991 generic.go:334] "Generic (PLEG): container finished" podID="755e0e60-a808-43eb-a0a3-82f89978a640" containerID="081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0" exitCode=0 Dec 06 02:15:05 crc kubenswrapper[4991]: I1206 02:15:05.677730 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerDied","Data":"081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0"} Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.053214 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-content/0.log" Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.094282 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-utilities/0.log" Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.130234 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-content/0.log" Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.252412 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-content/0.log" Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.278395 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/extract-utilities/0.log" Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.693893 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerStarted","Data":"d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd"} Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.713717 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mc9ts" podStartSLOduration=3.02231714 podStartE2EDuration="5.713701357s" podCreationTimestamp="2025-12-06 02:15:01 +0000 UTC" firstStartedPulling="2025-12-06 02:15:03.650782855 +0000 UTC m=+4377.001073323" lastFinishedPulling="2025-12-06 02:15:06.342167072 +0000 UTC m=+4379.692457540" observedRunningTime="2025-12-06 02:15:06.712780933 +0000 UTC m=+4380.063071411" watchObservedRunningTime="2025-12-06 02:15:06.713701357 +0000 UTC m=+4380.063991825" Dec 06 02:15:06 crc kubenswrapper[4991]: I1206 02:15:06.797809 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wb2rc_aa0283c0-7c60-47d5-9bf8-28bf2c1d0ecc/registry-server/0.log" Dec 06 02:15:12 crc kubenswrapper[4991]: I1206 02:15:12.143700 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:12 crc kubenswrapper[4991]: I1206 02:15:12.145519 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:12 crc kubenswrapper[4991]: I1206 02:15:12.194997 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:12 crc kubenswrapper[4991]: I1206 02:15:12.806203 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:12 crc kubenswrapper[4991]: I1206 02:15:12.866436 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-mc9ts"] Dec 06 02:15:14 crc kubenswrapper[4991]: I1206 02:15:14.796252 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mc9ts" podUID="755e0e60-a808-43eb-a0a3-82f89978a640" containerName="registry-server" containerID="cri-o://d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd" gracePeriod=2 Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.373445 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.475949 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-catalog-content\") pod \"755e0e60-a808-43eb-a0a3-82f89978a640\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.476192 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-utilities\") pod \"755e0e60-a808-43eb-a0a3-82f89978a640\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.476242 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt6hq\" (UniqueName: \"kubernetes.io/projected/755e0e60-a808-43eb-a0a3-82f89978a640-kube-api-access-tt6hq\") pod \"755e0e60-a808-43eb-a0a3-82f89978a640\" (UID: \"755e0e60-a808-43eb-a0a3-82f89978a640\") " Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.478931 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-utilities" (OuterVolumeSpecName: "utilities") pod "755e0e60-a808-43eb-a0a3-82f89978a640" (UID: 
"755e0e60-a808-43eb-a0a3-82f89978a640"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.483410 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755e0e60-a808-43eb-a0a3-82f89978a640-kube-api-access-tt6hq" (OuterVolumeSpecName: "kube-api-access-tt6hq") pod "755e0e60-a808-43eb-a0a3-82f89978a640" (UID: "755e0e60-a808-43eb-a0a3-82f89978a640"). InnerVolumeSpecName "kube-api-access-tt6hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.543605 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "755e0e60-a808-43eb-a0a3-82f89978a640" (UID: "755e0e60-a808-43eb-a0a3-82f89978a640"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.578175 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.578214 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e0e60-a808-43eb-a0a3-82f89978a640-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.578223 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt6hq\" (UniqueName: \"kubernetes.io/projected/755e0e60-a808-43eb-a0a3-82f89978a640-kube-api-access-tt6hq\") on node \"crc\" DevicePath \"\"" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.806842 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="755e0e60-a808-43eb-a0a3-82f89978a640" containerID="d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd" exitCode=0 Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.806903 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerDied","Data":"d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd"} Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.806965 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc9ts" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.808033 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc9ts" event={"ID":"755e0e60-a808-43eb-a0a3-82f89978a640","Type":"ContainerDied","Data":"e06b140f04d10a8e6569dd2e5feb5875d82961af5cbff1b4ae625a5bd66c308b"} Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.808132 4991 scope.go:117] "RemoveContainer" containerID="d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.842849 4991 scope.go:117] "RemoveContainer" containerID="081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.861610 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc9ts"] Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.872565 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mc9ts"] Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.873476 4991 scope.go:117] "RemoveContainer" containerID="61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.932576 4991 scope.go:117] "RemoveContainer" 
containerID="d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd" Dec 06 02:15:15 crc kubenswrapper[4991]: E1206 02:15:15.933506 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd\": container with ID starting with d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd not found: ID does not exist" containerID="d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.933556 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd"} err="failed to get container status \"d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd\": rpc error: code = NotFound desc = could not find container \"d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd\": container with ID starting with d676a62491039e66a7655c698e896c96459c0b4cffe6aa86a57419002e7553cd not found: ID does not exist" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.933594 4991 scope.go:117] "RemoveContainer" containerID="081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0" Dec 06 02:15:15 crc kubenswrapper[4991]: E1206 02:15:15.934228 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0\": container with ID starting with 081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0 not found: ID does not exist" containerID="081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.934285 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0"} err="failed to get container status \"081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0\": rpc error: code = NotFound desc = could not find container \"081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0\": container with ID starting with 081bb92305e44f46682b59068950ba526552ca30d7052e23687e3b0d7af26ec0 not found: ID does not exist" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.934322 4991 scope.go:117] "RemoveContainer" containerID="61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3" Dec 06 02:15:15 crc kubenswrapper[4991]: E1206 02:15:15.934693 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3\": container with ID starting with 61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3 not found: ID does not exist" containerID="61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3" Dec 06 02:15:15 crc kubenswrapper[4991]: I1206 02:15:15.934752 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3"} err="failed to get container status \"61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3\": rpc error: code = NotFound desc = could not find container \"61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3\": container with ID starting with 61a622430697d053533a04c93e42c0eb6bb90f891591e4183bf7e3fad6ba60f3 not found: ID does not exist" Dec 06 02:15:17 crc kubenswrapper[4991]: I1206 02:15:17.052325 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755e0e60-a808-43eb-a0a3-82f89978a640" path="/var/lib/kubelet/pods/755e0e60-a808-43eb-a0a3-82f89978a640/volumes" Dec 06 02:15:19 crc kubenswrapper[4991]: I1206 
02:15:19.048019 4991 scope.go:117] "RemoveContainer" containerID="54631a6f269a9d0599b393fce077d2ab0395ce90b3697f6808de48227ca63118" Dec 06 02:15:34 crc kubenswrapper[4991]: E1206 02:15:34.921865 4991 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.110:33904->38.102.83.110:34783: read tcp 38.102.83.110:33904->38.102.83.110:34783: read: connection reset by peer Dec 06 02:15:34 crc kubenswrapper[4991]: E1206 02:15:34.922603 4991 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:33904->38.102.83.110:34783: write tcp 38.102.83.110:33904->38.102.83.110:34783: write: broken pipe Dec 06 02:15:44 crc kubenswrapper[4991]: E1206 02:15:44.374119 4991 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:34500->38.102.83.110:34783: write tcp 38.102.83.110:34500->38.102.83.110:34783: write: broken pipe Dec 06 02:16:54 crc kubenswrapper[4991]: I1206 02:16:54.916141 4991 generic.go:334] "Generic (PLEG): container finished" podID="79cf9126-93c7-4477-a7ad-fd310b716db4" containerID="f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae" exitCode=0 Dec 06 02:16:54 crc kubenswrapper[4991]: I1206 02:16:54.916335 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" event={"ID":"79cf9126-93c7-4477-a7ad-fd310b716db4","Type":"ContainerDied","Data":"f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae"} Dec 06 02:16:54 crc kubenswrapper[4991]: I1206 02:16:54.920566 4991 scope.go:117] "RemoveContainer" containerID="f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae" Dec 06 02:16:55 crc kubenswrapper[4991]: I1206 02:16:55.703404 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2nnk_must-gather-2fsl5_79cf9126-93c7-4477-a7ad-fd310b716db4/gather/0.log" Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.053493 4991 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-must-gather-n2nnk/must-gather-2fsl5"] Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.054300 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" podUID="79cf9126-93c7-4477-a7ad-fd310b716db4" containerName="copy" containerID="cri-o://218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0" gracePeriod=2 Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.066802 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2nnk/must-gather-2fsl5"] Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.507166 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2nnk_must-gather-2fsl5_79cf9126-93c7-4477-a7ad-fd310b716db4/copy/0.log" Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.508376 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.652885 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mcnh\" (UniqueName: \"kubernetes.io/projected/79cf9126-93c7-4477-a7ad-fd310b716db4-kube-api-access-6mcnh\") pod \"79cf9126-93c7-4477-a7ad-fd310b716db4\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.653044 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79cf9126-93c7-4477-a7ad-fd310b716db4-must-gather-output\") pod \"79cf9126-93c7-4477-a7ad-fd310b716db4\" (UID: \"79cf9126-93c7-4477-a7ad-fd310b716db4\") " Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.661086 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79cf9126-93c7-4477-a7ad-fd310b716db4-kube-api-access-6mcnh" (OuterVolumeSpecName: 
"kube-api-access-6mcnh") pod "79cf9126-93c7-4477-a7ad-fd310b716db4" (UID: "79cf9126-93c7-4477-a7ad-fd310b716db4"). InnerVolumeSpecName "kube-api-access-6mcnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.787263 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mcnh\" (UniqueName: \"kubernetes.io/projected/79cf9126-93c7-4477-a7ad-fd310b716db4-kube-api-access-6mcnh\") on node \"crc\" DevicePath \"\"" Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.803885 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79cf9126-93c7-4477-a7ad-fd310b716db4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "79cf9126-93c7-4477-a7ad-fd310b716db4" (UID: "79cf9126-93c7-4477-a7ad-fd310b716db4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 02:17:06 crc kubenswrapper[4991]: I1206 02:17:06.889280 4991 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79cf9126-93c7-4477-a7ad-fd310b716db4-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.061204 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2nnk_must-gather-2fsl5_79cf9126-93c7-4477-a7ad-fd310b716db4/copy/0.log" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.061500 4991 generic.go:334] "Generic (PLEG): container finished" podID="79cf9126-93c7-4477-a7ad-fd310b716db4" containerID="218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0" exitCode=143 Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.061568 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2nnk/must-gather-2fsl5" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.071532 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79cf9126-93c7-4477-a7ad-fd310b716db4" path="/var/lib/kubelet/pods/79cf9126-93c7-4477-a7ad-fd310b716db4/volumes" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.072223 4991 scope.go:117] "RemoveContainer" containerID="218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.113264 4991 scope.go:117] "RemoveContainer" containerID="f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.187310 4991 scope.go:117] "RemoveContainer" containerID="218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0" Dec 06 02:17:07 crc kubenswrapper[4991]: E1206 02:17:07.187866 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0\": container with ID starting with 218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0 not found: ID does not exist" containerID="218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.187931 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0"} err="failed to get container status \"218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0\": rpc error: code = NotFound desc = could not find container \"218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0\": container with ID starting with 218f4da2d95d861357f740c41588478ee075fcbc153364e53df3e84e9f7ecad0 not found: ID does not exist" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.187964 4991 
scope.go:117] "RemoveContainer" containerID="f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae" Dec 06 02:17:07 crc kubenswrapper[4991]: E1206 02:17:07.188466 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae\": container with ID starting with f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae not found: ID does not exist" containerID="f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae" Dec 06 02:17:07 crc kubenswrapper[4991]: I1206 02:17:07.188515 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae"} err="failed to get container status \"f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae\": rpc error: code = NotFound desc = could not find container \"f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae\": container with ID starting with f5672bfa3ad30959a08c4a894dde2c067d46106efc4594d3136d3b231175f3ae not found: ID does not exist" Dec 06 02:17:16 crc kubenswrapper[4991]: E1206 02:17:16.690641 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice/crio-069679d256cef451a0e66142b17c5133903ede73900306db0c9182098f1be591\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 02:17:16 crc kubenswrapper[4991]: I1206 02:17:16.740120 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:17:16 crc kubenswrapper[4991]: I1206 02:17:16.740199 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:17:26 crc kubenswrapper[4991]: E1206 02:17:26.948835 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice/crio-069679d256cef451a0e66142b17c5133903ede73900306db0c9182098f1be591\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 02:17:37 crc kubenswrapper[4991]: E1206 02:17:37.223803 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice/crio-069679d256cef451a0e66142b17c5133903ede73900306db0c9182098f1be591\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 02:17:46 crc kubenswrapper[4991]: I1206 02:17:46.739348 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Dec 06 02:17:46 crc kubenswrapper[4991]: I1206 02:17:46.739961 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:17:47 crc kubenswrapper[4991]: E1206 02:17:47.487978 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice/crio-069679d256cef451a0e66142b17c5133903ede73900306db0c9182098f1be591\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 02:17:57 crc kubenswrapper[4991]: E1206 02:17:57.741384 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice/crio-069679d256cef451a0e66142b17c5133903ede73900306db0c9182098f1be591\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf9126_93c7_4477_a7ad_fd310b716db4.slice\": RecentStats: unable to find data in memory cache]" Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.739568 4991 patch_prober.go:28] interesting pod/machine-config-daemon-ppxx8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.740301 4991 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.740372 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.741538 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b"} pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.741856 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerName="machine-config-daemon" containerID="cri-o://369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" gracePeriod=600 Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.930978 4991 generic.go:334] "Generic (PLEG): container finished" podID="f35cea72-9e31-4432-9e3b-16a50ebce794" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" exitCode=0 Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.931056 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" event={"ID":"f35cea72-9e31-4432-9e3b-16a50ebce794","Type":"ContainerDied","Data":"369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b"} Dec 06 02:18:16 crc kubenswrapper[4991]: I1206 02:18:16.931226 4991 
scope.go:117] "RemoveContainer" containerID="acdbee5b0dca7edd2b37629b57df167a9453fec9ce78915f11caf35e1a51bfe0" Dec 06 02:18:17 crc kubenswrapper[4991]: E1206 02:18:17.888775 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:18:17 crc kubenswrapper[4991]: I1206 02:18:17.948556 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:18:17 crc kubenswrapper[4991]: E1206 02:18:17.949138 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:18:19 crc kubenswrapper[4991]: I1206 02:18:19.243226 4991 scope.go:117] "RemoveContainer" containerID="7bb8fa2528aad2fea8e781407c65ef5d74386220828229537196b321a35c5459" Dec 06 02:18:19 crc kubenswrapper[4991]: I1206 02:18:19.289812 4991 scope.go:117] "RemoveContainer" containerID="918572af3cfaa906d3309814fe043f12ff9ce7e5952fefd76c3fc7b25039187a" Dec 06 02:18:30 crc kubenswrapper[4991]: I1206 02:18:30.037850 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:18:30 crc kubenswrapper[4991]: E1206 02:18:30.038993 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:18:45 crc kubenswrapper[4991]: I1206 02:18:45.038347 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:18:45 crc kubenswrapper[4991]: E1206 02:18:45.039311 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:18:58 crc kubenswrapper[4991]: I1206 02:18:58.038258 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:18:58 crc kubenswrapper[4991]: E1206 02:18:58.039004 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:19:12 crc kubenswrapper[4991]: I1206 02:19:12.039407 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:19:12 crc kubenswrapper[4991]: E1206 02:19:12.041074 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:19:26 crc kubenswrapper[4991]: I1206 02:19:26.039462 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:19:26 crc kubenswrapper[4991]: E1206 02:19:26.040706 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:19:39 crc kubenswrapper[4991]: I1206 02:19:39.038446 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:19:39 crc kubenswrapper[4991]: E1206 02:19:39.039612 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:19:50 crc kubenswrapper[4991]: I1206 02:19:50.423155 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:19:50 crc kubenswrapper[4991]: E1206 02:19:50.424059 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:20:02 crc kubenswrapper[4991]: I1206 02:20:02.038176 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:20:02 crc kubenswrapper[4991]: E1206 02:20:02.040087 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794" Dec 06 02:20:16 crc kubenswrapper[4991]: I1206 02:20:16.038095 4991 scope.go:117] "RemoveContainer" containerID="369666e7a5a42942a80da266a62c4d04fa304b1fe28f89d531fbf75c37dd1f3b" Dec 06 02:20:16 crc kubenswrapper[4991]: E1206 02:20:16.039210 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ppxx8_openshift-machine-config-operator(f35cea72-9e31-4432-9e3b-16a50ebce794)\"" pod="openshift-machine-config-operator/machine-config-daemon-ppxx8" podUID="f35cea72-9e31-4432-9e3b-16a50ebce794"